
License and Use

Open Access


Analysis of institutional authors

Daza, Roberto (Corresponding Author); Gomez, Luis F (Author); Fierrez, Julian (Author); Morales, Aythami (Author); Tolosana, Ruben (Author); Ortega-Garcia, Javier (Author)


February 17, 2025

DeepFace-Attention: Multimodal Face Biometrics for Attention Estimation With Application to e-Learning

Published in: IEEE Access, vol. 12, pp. 111343-111359, 2024. DOI: 10.1109/ACCESS.2024.3437291

Authors: Daza, Roberto; Gomez, Luis F; Fierrez, Julian; Morales, Aythami; Tolosana, Ruben; Ortega-Garcia, Javier

Affiliations

Univ Autonoma Madrid, Biometr & Data Pattern Analyt Lab, Campus Cantoblanco, Madrid 28049, Spain - Author

Abstract

This work introduces an innovative method for estimating attention levels (cognitive load) using an ensemble of facial analysis techniques applied to webcam videos. Our method is particularly useful, among others, in e-learning applications, so we trained, evaluated, and compared our approach on the mEBAL2 database, a public multi-modal database acquired in an e-learning environment. mEBAL2 comprises data from 60 users who performed 8 different tasks. These tasks varied in difficulty, leading to changes in their cognitive loads. Our approach adapts state-of-the-art facial analysis technologies to quantify the users' cognitive load in the form of high or low attention. Several behavioral signals and physiological processes related to cognitive load are used, such as eyeblink, heart rate, facial action units, and head pose, among others. Furthermore, we conduct a study to understand which individual features obtain better results, which combinations are most efficient, how local and global features compare, and how the temporal intervals considered affect attention level estimation, among other aspects. We find that global facial features are more appropriate for multimodal systems using score-level fusion, particularly as the temporal window increases. On the other hand, local features are more suitable for fusion through neural network training with score-level fusion approaches. Our method outperforms existing state-of-the-art accuracies on the public mEBAL2 benchmark.
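As a rough illustration of the score-level fusion described in the abstract, the sketch below combines per-module attention scores (eyeblink, heart rate, facial action units, head pose) computed over temporal windows with a weighted average and thresholds the result into high/low attention. The module names, uniform weights, and 0.5 threshold are illustrative assumptions, not the configuration reported in the paper.

    """Minimal sketch of score-level fusion for attention estimation.

    Assumes each facial-analysis module already produces one attention score
    in [0, 1] per temporal window of the webcam video; module names, uniform
    weights, and the 0.5 threshold are illustrative, not the paper's setup.
    """
    from __future__ import annotations

    import numpy as np


    def fuse_scores(scores: dict[str, np.ndarray],
                    weights: dict[str, float] | None = None) -> np.ndarray:
        """Score-level fusion: weighted average of per-module score sequences."""
        if weights is None:
            weights = {name: 1.0 / len(scores) for name in scores}  # uniform weights
        return sum(weights[name] * scores[name] for name in scores)


    def classify_attention(fused: np.ndarray, threshold: float = 0.5) -> np.ndarray:
        """Binarize fused scores into high (1) / low (0) attention per window."""
        return (fused >= threshold).astype(int)


    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        n_windows = 10  # consecutive temporal windows of one session
        per_module = {  # placeholder scores standing in for real estimators
            "eyeblink": rng.random(n_windows),
            "heart_rate": rng.random(n_windows),
            "action_units": rng.random(n_windows),
            "head_pose": rng.random(n_windows),
        }
        print(classify_attention(fuse_scores(per_module)))

In a setup like this, the weights could be tuned on a validation split, and longer temporal windows would simply mean fewer, more aggregated scores per module before fusion.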

Keywords

Attention estimation; Behavioral analysis; Behavioral sciences; Blink rate; Cognitive load; Cognitive processes; Databases; Deep learning; E-learning; Electroencephalography; Electronic learning; Estimation; Expressions; Eyeblink; Face recognition; Facial action units; Fusion; Head pose detection; Heart rate detection; Multi-modal learning; Multimodal sensors; Multiple classifiers; Pose estimation; Systems; Task analysis

Quality index

Bibliometric impact. Analysis of the contribution and dissemination channel

The work has been published in the journal IEEE Access, which, given its trajectory and the strong impact it has achieved in recent years according to the Scopus (SJR) agency, has become a reference in its field. For the year of publication, 2024, no indicators have been calculated yet, but in 2023 it was ranked in a position that placed it in the first quartile (Q1) of the Engineering (Miscellaneous) category.

Regardless of the expected impact determined by the dissemination channel, it is important to highlight the actual observed impact of the contribution itself.

According to the different indexing agencies, the number of citations accumulated by this publication as of 2025-12-13 is:

  • WoS: 2
  • Scopus: 6

Impact and social visibility

From the perspective of influence or social adoption, and based on the mention and interaction metrics provided by agencies that specialize in calculating the so-called "Alternative or Social Metrics," the following can be highlighted as of 2025-12-13:

  • Academic use, as evidenced by the Altmetric indicator based on saves in the Mendeley personal bibliographic manager, gives a total of: 14.
  • Use of this contribution through bookmarks, code forks, additions to favorite lists for recurrent reading, and general views indicates that others are using the publication as a basis for their current work, which may be a notable predictor of future, more formal academic citations. This is supported by the "Capture" indicator, which yields a total of: 14 (PlumX).

With a more dissemination-oriented intent and targeting more general audiences, we can observe other more global scores such as:

  • The Total Score from Altmetric: 2.
  • The number of mentions on the social network X (formerly Twitter): 2 (Altmetric).

It is essential to present evidence supporting full alignment with institutional principles and guidelines on Open Science and the Conservation and Dissemination of Intellectual Heritage. Clear examples of this are:

  • The work has been submitted to a journal whose editorial policy allows Open Access publication.
  • Assignment of a Handle/URN as an identifier within the deposit in the Institutional Repository: https://repositorio.uam.es/handle/10486/715742

Leadership analysis of institutional authors

There is a significant leadership presence as some of the institution’s authors appear as the first or last signer, detailed as follows: First Author (DAZA GARCIA, ROBERTO) and Last Author (ORTEGA GARCIA, JAVIER).

The author responsible for correspondence has been DAZA GARCIA, ROBERTO.

Awards linked to the item

This work was supported in part by project HumanCAIC under Grant TED2021-131787B-I00 MICINN; in part by project BBforTAI under Grant PID2021-127641OB-I00 MICINN/FEDER; in part by project BIO-PROCTORING (GNOSS Program, Agreement Ministerio de Defensa-UAM-FUAM dated 29-03-2022); in part by the Catedra ENIA UAM-VERIDAS en IA Responsable (NextGenerationEU PRTR) under Grant TSI-100927-2023-2; and in part by the Autonomous Community of Madrid. The work of Roberto Daza was supported by the FPI Fellowship from MINECO/FEDER. The work of Aythami Morales was supported by the Madrid Government (Comunidad de Madrid-Spain) under the Multiannual Agreement with Universidad Autonoma de Madrid in the line of Excellence for the University Teaching Staff in the context of the Regional Program of Research and Technological Innovation (V PRICIT).