Communication between healthcare professionals and deaf patients has been particularly challenging during the COVID-19 pandemic. We have explored the possibility of automatically translating phrases that are frequently used in the diagnosis and treatment of hospital patients, in particular phrases related to COVID-19, from Dutch or English into Dutch Sign Language (NGT). The prototype system we developed displays translations either by means of pre-recorded videos featuring a deaf human signer (for a limited number of sentences) or by means of animations featuring a computer-generated signing avatar (for a larger, though still restricted, number of sentences). We evaluated the comprehensibility of the signing avatar as compared to the human signer. We found that, while individual signs are recognized correctly almost as frequently when signed by the avatar as when signed by a human, sentence comprehension rates and clarity scores for the avatar are substantially lower than for the human signer. We identify a number of concrete limitations of the JASigning avatar engine that underlies our system: the engine currently does not offer sufficient control over mouth shapes, over the relative speed and intensity of signs within a sentence (prosody), or over transitions between signs. These limitations need to be overcome in future work for the engine to become usable in practice.
In this report, the details of an investigation into the effect of low-induction wind turbines on the Levelised Cost of Electricity (LCoE) in a 1 GW offshore wind farm are outlined. The 10 MW INNWIND.EU conventional wind turbine and its low-induction variant, the 10 MW AVATAR wind turbine, are considered in a variety of 10x10 layout configurations. The Annual Energy Production (AEP) and the cost of the electrical infrastructure were determined using two in-house ECN software tools, FarmFlow and EEFarm II. Combining this information with a generalised cost model, the LCoE for these layouts was determined. The optimum LCoE for the AVATAR wind farm was determined to be 92.15 €/MWh, while for the INNWIND.EU wind farm it was 93.85 €/MWh. Although the low-induction wind farm offered a marginally lower LCoE, this result should not be considered definitive due to the simple nature of the cost model used. The results do indicate that the AVATAR wind farm requires less space to achieve this similar cost performance, with a higher optimal wind farm power density (WFPD) of 3.7 MW/km² compared to 3 MW/km² for the INNWIND.EU-based wind farm.
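For reference, a minimal sketch of how an LCoE figure of this kind is typically obtained; the report's generalised cost model may differ in its cost categories and discounting assumptions, so this is the textbook form rather than the model actually used:

\[
\mathrm{LCoE} \;=\; \frac{\sum_{t=0}^{T} (I_t + M_t)\,(1+r)^{-t}}{\sum_{t=1}^{T} E_t\,(1+r)^{-t}}
\]

where I_t and M_t are the investment and operation-and-maintenance costs incurred in year t, E_t is the energy produced in year t (here derived from the AEP predicted by FarmFlow), r is the discount rate, and T is the farm lifetime.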
In this project, the AGM R&D team developed and refined the use of a facial scanning rig. The rig is a physical device comprising multiple cameras and lights mounted on scaffolding around a 'scanning volume', an area in which objects are placed before being photographed from multiple angles. The object is typically a person's head, but it can be anything of approximately this size. Software compares the photographs to create a digital 3D recreation, a process called photogrammetry. The 3D model is then processed by further pieces of software and eventually becomes a face that can be animated inside Unreal Engine, a popular piece of game development software made by the company Epic. This project was funded by Epic's 'Megagrant' system, and the focus of the work is on streamlining and automating the processing pipeline, and on improving the quality of the resulting output. Additional work has been done on skin shaders (simulating the appearance of real skin in digital form) and on the use of AI to re/create lifelike hair styles. The R&D work has produced significant savings in processing time, has improved the quality of facial scans, has produced a system that has benefited the educational offering of BUas, and has attracted collaborators from the commercial entertainment/simulation industries. This work complements and extends previous work done on the VIBE project, where the focus was on creating lifelike human avatars for the medical industry.
This proposal aims to develop a virtual patient that offers psychologists in training a platform to practise their soft skills across a wide range of situations. Its development gives the participating SMEs the opportunity to carry out research and development on such a platform and to deepen their expertise in AI. At present, soft-skills training consists mainly of role-playing with fellow students, and only in a limited number of scenarios. This means that students are not adequately prepared for the complexity of clinical practice. Although alternatives have long been sought, such as practising with actors, this proves difficult to organise in practice and financially burdensome. As a result, students are insufficiently exposed to the reality of the professional field in which they will work. For a more authentic interaction between the therapist in training and the virtual patient, it is important that emotions are displayed during the conversations. We therefore investigate the possibilities of automatic emotion recognition in dialogue interactions, so that the virtual patient is able to integrate emotions into its language use. This enables the therapist in training to make a diagnosis based on both the content of the conversation and the emotions expressed. Through interaction with the virtual patient, students can experiment with diverse situations, responses and personalities, which provides a more realistic training environment. This approach not only addresses the existing lack of diverse practice experiences, but also offers a flexible and scalable solution that can easily be integrated into existing curricula. The result is better preparation of students for their future professional practice. Virtual patients can make these interactive experiences possible. This responds to the wish of the participating SMEs to engage more deeply with AI technology that benefits the healthcare sector.
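As an illustration only, a minimal sketch of per-utterance emotion recognition on dialogue turns, assuming an off-the-shelf transformer classifier from the Hugging Face Hub; the model named here is an English-language example and is not the model used in this project, which would need Dutch-capable components:

# Illustrative sketch: label each dialogue turn with its most likely emotion.
# Requires: pip install transformers torch
from transformers import pipeline

# Example model choice for illustration; the project's own (Dutch) model may differ.
classifier = pipeline("text-classification",
                      model="j-hartmann/emotion-english-distilroberta-base")

dialogue = [
    "I have not been sleeping well since the accident.",
    "Talking about it still makes me very anxious.",
]

for turn in dialogue:
    prediction = classifier(turn)[0]   # top predicted emotion label and its confidence
    print(f"{prediction['label']:>8}  {prediction['score']:.2f}  {turn}")

In a virtual-patient setting, labels of this kind could be attached to each turn so that the dialogue system conditions its wording on the recognised emotion, and so that the trainee's eventual diagnosis can draw on both the content and the emotional signal of the conversation.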