
Pre-PhD Student - Action Detection for Improving Autism Diagnosis (M/F)

Nice
Fixed-term contract (CDD)
Inria
€1,801 per month
Posted 15 hours ago
Job description

About Inria

Inria is the French national research institute dedicated to digital science and technology. It employs 2,600 people. Its 215 agile project teams, generally run jointly with academic partners, involve more than 3,900 scientists in meeting the challenges of digital technology, often at the interface with other disciplines. The institute draws on a wide range of talent across more than forty professions. 900 research and innovation support staff help scientific and entrepreneurial projects with worldwide impact to emerge and grow. Inria works with many companies and has supported the creation of more than 200 start-ups. In this way, the institute strives to meet the challenges of the digital transformation of science, society and the economy.
The offer description below is in English.
Contract type: Fixed-term (CDD)

Renewable contract: Yes

Required degree: Master's (Bac +5) or equivalent

Role: Contract researcher

About the centre

The Inria centre at Université Côte d'Azur includes 42 research teams and 9 support services. The centre's staff (about 500 people) is made up of scientists of different nationalities, engineers, technicians and administrative staff. The teams are mainly located on the university campuses of Sophia Antipolis and Nice, as well as Montpellier, in close collaboration with research and higher education laboratories and establishments (Université Côte d'Azur, CNRS, INRAE, INSERM), but also with regional economic players.

With a presence in the fields of computational neuroscience and biology, data science and modeling, software engineering and certification, as well as collaborative robotics, the Inria Centre at Université Côte d'Azur is a major player in terms of scientific excellence through its results and collaborations at both European and international levels.

Context and assets of the position

Inria, the French National Institute for Computer Science and Applied Mathematics, promotes scientific excellence for technology transfer and society. Graduates from the world's top universities, Inria's 2,700 employees rise to the challenges of digital sciences. With its open, agile model, Inria can explore original approaches with its partners in industry and academia and provide an efficient response to the multidisciplinary and application challenges of digital transformation. Inria is the source of many innovations that add value and create jobs.

Team

The STARS research team combines advanced theory with cutting-edge practice, with a focus on computer vision systems.

Team website:

Scientific context

Actions speak louder than words. Humans are complex beings, and they often convey a wealth of information not through their words but through their actions and demeanor. Non-verbal behaviors can offer crucial insights into a person's emotional state, pain level, or anxiety, often more eloquently than words alone [1]. The analysis of non-verbal communication is therefore of critical importance in the diagnostic landscape. Consider toddlers who struggle to describe the intensity of their abdominal pain: they may nonetheless express their agony non-verbally by tightly gripping their abdomen, indicating its severity. When pain cannot be expressed verbally due to cognitive limitations, language discrepancies, or emotional distress, effective analysis of non-verbal cues can compensate.

However, decoding non-verbal cues in a clinical setting is not a straightforward task. It relies heavily on a high degree of inference. It requires healthcare professionals to be astute observers, picking up on nuances that may be subtle yet critical. For instance, a slight furrow of the brow could signify discomfort or concern, and a patient's posture may reveal signs of tension or distress. The challenge lies in accurately interpreting these cues, as they can vary greatly from one individual to another.

To address this challenge, automated systems capable of detecting non-verbal behaviors and their corresponding meanings can assist healthcare providers. Such technology acts as a supportive tool for medical experts, enhancing patient assessments and ensuring that critical information is not overlooked or misunderstood. In essence, recognizing and interpreting non-verbal signs is essential for holistic patient care, and advanced technology can augment its accuracy and effectiveness, leading to improved diagnosis and treatment outcomes.

Assigned mission

The primary objective of this research is to lead the development of an advanced AI model for Human Behavior Understanding [2] to identify non-verbal cues expressed by patients, and then interpret those cues to derive critical insights about their health. Traditionally, computer vision methodologies encompassing skin color analysis, shape analysis, pixel intensity examination, and anisotropic diffusion were used to identify body parts and trace their activities. However, these algorithms provided limited flexibility because of their domain-specific nature. Deep learning methods can address this issue, as they offer more training flexibility and better performance. The overarching goal is to provide a real-time, data-driven analysis of non-verbal cues exhibited by patients during clinical interactions, thereby delivering invaluable insights to healthcare practitioners.
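Of the classical techniques listed above, anisotropic diffusion is perhaps the least self-explanatory; the sketch below illustrates the Perona-Malik scheme it usually refers to, which smooths flat regions while preserving edges. The parameter values (kappa, n_iter, lam) are illustrative assumptions, not project settings.

```python
import numpy as np

def anisotropic_diffusion(img, n_iter=10, kappa=20.0, lam=0.2):
    """Perona-Malik edge-preserving smoothing (zero-flux borders)."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # Intensity differences toward the four neighbours.
        dn = np.zeros_like(u); dn[1:, :] = u[:-1, :] - u[1:, :]
        ds = np.zeros_like(u); ds[:-1, :] = u[1:, :] - u[:-1, :]
        de = np.zeros_like(u); de[:, :-1] = u[:, 1:] - u[:, :-1]
        dw = np.zeros_like(u); dw[:, 1:] = u[:, :-1] - u[:, 1:]
        # Edge-stopping function: diffusion shuts down across strong edges.
        g = lambda d: np.exp(-(d / kappa) ** 2)
        u += lam * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u

# A noisy step edge: diffusion smooths the flat halves but keeps the edge.
rng = np.random.default_rng(1)
img = np.zeros((20, 20)); img[:, 10:] = 100.0
noisy = img + rng.normal(0.0, 5.0, img.shape)
smoothed = anisotropic_diffusion(noisy)
print(noisy[:, :8].std(), smoothed[:, :8].std())  # noise std drops
```

The edge-stopping function is what makes the method "domain-specific" in practice: kappa must be tuned to the gradient magnitudes of each imaging setup, which is exactly the inflexibility the text contrasts with learned models.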

Approach:

In this work, we aim to develop a novel data-driven, deep learning model to analyze the behaviors of patients during clinical interactions. These interactions can be recorded as single-view videos, which contain comprehensive information about the overall behavior of the patient being examined. Traditional object detection approaches are centered on a two-stage methodology: first identifying the regions that contain an object, then refining the region proposals to perform multiclass classification [3]. An alternative is a single-shot detection algorithm, as proposed by YOLO [4]. However, while creating bounding boxes around the desired objects and predicting class probabilities is useful, it is not enough to capture the dynamics of non-verbal cues and their clinical interpretation.
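As a concrete illustration (not part of the project's codebase), both detector families end with the same post-processing step: greedy non-maximum suppression over scored bounding boxes. A minimal pure-Python sketch:

```python
# Boxes are (x1, y1, x2, y2); scores are detector confidences.

def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def nms(boxes, scores, iou_threshold=0.5):
    """Keep the highest-scoring box, drop overlapping ones, repeat."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    kept = []
    for i in order:
        if all(iou(boxes[i], boxes[j]) < iou_threshold for j in kept):
            kept.append(i)
    return kept

# Two overlapping detections of the same person plus one distant box.
boxes = [(10, 10, 50, 90), (12, 12, 52, 92), (100, 100, 140, 180)]
scores = [0.9, 0.6, 0.8]
print(nms(boxes, scores))  # -> [0, 2]
```

The output is a set of discrete boxes per frame, which is the limitation noted above: boxes alone carry no temporal dynamics and no clinical meaning.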

To deal with this problem, novel foundation models [5] could be leveraged to encode visual features in the videos and perform resilient behavior tracking and understanding. We propose a self-supervised transformer model for semantically segmenting the body parts, tracking their real-time location, obtaining critical positional and behavioral information, and decoding it to perform thorough clinical analysis through a linear classification backbone [8]. Self-supervision would enable the model to learn even with scarce labeled data, and to provide optimal predictions within a computationally efficient framework.
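The "linear classification backbone" amounts to a linear probe: the self-supervised encoder is frozen and only a linear classifier is trained on its features. The sketch below shows the idea with simulated frozen features (the class prototypes, dimensions, and ridge parameter are all illustrative assumptions, not project data):

```python
import numpy as np

rng = np.random.default_rng(0)
n_classes, dim, n_per_class = 3, 64, 50

# Simulated frozen features: each behaviour class clusters around a
# prototype, standing in for the output of a frozen self-supervised encoder.
prototypes = rng.normal(size=(n_classes, dim))
X = np.vstack([prototypes[c] + 0.3 * rng.normal(size=(n_per_class, dim))
               for c in range(n_classes)])
y = np.repeat(np.arange(n_classes), n_per_class)

# Linear probe fitted in closed form (ridge regression on one-hot targets);
# only W is learned, the features X are never updated.
Y = np.eye(n_classes)[y]
lam = 1e-2
W = np.linalg.solve(X.T @ X + lam * np.eye(dim), X.T @ Y)

pred = (X @ W).argmax(axis=1)
accuracy = (pred == y).mean()
print(f"linear-probe training accuracy: {accuracy:.2f}")
```

Because only the linear layer is trained, the probe needs few labels, which is how self-supervision copes with the scarce annotated clinical data mentioned above.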

By the culmination of this work, the aspiration is to contribute substantially to the advancement of an AI system that augments clinical communication by offering a technically refined analysis of non-verbal cues. This undertaking not only bears the potential to enhance medical diagnostics but also extends its applicability to diverse domains necessitating comprehensive non-verbal behavior analysis, including human-computer interaction paradigms and scholarly research in the realm of psychology [10]. This work will be conducted within the CoBTek team from Nice Hospital, which is specialized in clinical trials for autistic children.

The project is based on filmed assessments recorded as part of current clinical practice at the Centre Ressources Autisme des Hôpitaux Pédiatriques Nice CHU-Lenval, linked to the CoBTek laboratory. The research is part of the ANR ACTIVIS project. The project brings together multimodal skills in AI, body movement analysis, and linguistics.

Main activities

The Inria STARS team is seeking a pre-PhD student with a strong background in computer vision, deep learning, and video understanding.

Months 1-2:

- Establish benchmarking datasets
- Evaluate existing models on the benchmark datasets

Months 3-4:

- Propose the novel self-supervised model
- Evaluate the proposed model on the benchmark datasets

Months 5-6:

- Optimize the proposed algorithm for real-world scenarios
- Write a paper
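For the evaluation steps above, one common metric in temporal action detection is frame-wise F1 between predicted and ground-truth action intervals. A self-contained sketch (the interval values and frame count are made up for illustration):

```python
def frames(intervals, n_frames):
    """Expand (start, end) intervals into a per-frame boolean mask."""
    mask = [False] * n_frames
    for start, end in intervals:
        for t in range(start, min(end, n_frames)):
            mask[t] = True
    return mask

def frame_f1(pred_intervals, true_intervals, n_frames):
    """Frame-wise F1: harmonic mean of per-frame precision and recall."""
    pred = frames(pred_intervals, n_frames)
    true = frames(true_intervals, n_frames)
    tp = sum(p and t for p, t in zip(pred, true))
    fp = sum(p and not t for p, t in zip(pred, true))
    fn = sum(t and not p for p, t in zip(pred, true))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Ground truth: action spans frames 10-30; prediction is shifted by 5 frames.
score = frame_f1([(15, 35)], [(10, 30)], n_frames=50)
print(f"frame-wise F1: {score:.2f}")  # -> 0.75
```

Interval-level metrics such as IoU-thresholded detection mAP are also standard; the frame-wise variant above is simply the easiest to compute and compare across models.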

Skills

Candidates must hold a Master's or Engineering degree or equivalent in Computer Science or a closely related discipline by the start date.

The candidate must be grounded in computer vision basics and have solid mathematical and programming skills.

Candidates should have theoretical knowledge of computer vision, mathematics, and deep learning (PyTorch, TensorFlow), together with a technical background in C++ and Python programming, OpenCV, and Linux.

The candidate must be committed to scientific research and to publishing substantial results.

In order to protect its scientific and technological assets, Inria is a restricted-access establishment. Consequently, it follows special regulations for welcoming any person who wishes to work with the institute. The final acceptance of each candidate thus depends on applying this security and defense procedure.

Benefits

- Subsidized meals
- Partial reimbursement of public transport costs
- Leave: 7 weeks of annual leave + 10 extra days off due to RTT (statutory reduction in working hours) + possibility of exceptional leave (sick children, moving home, etc.)
- Possibility of teleworking and flexible organization of working hours
- Professional equipment available (videoconferencing, loan of computer equipment, etc.)
- Social, cultural and sports events and activities
- Access to vocational training
- Contribution to mutual insurance (subject to conditions)

Remuneration

€1,801 gross per month
