Internship | Generative AI to improve a military vehicle detector trained on simulated data
About this position
Deep learning has emerged as a powerful tool for image analysis in various fields, including the military domain. It has the potential to automate and enhance tasks such as object detection, classification, and tracking. Training images for the development of such models are typically scarce due to the restricted nature of this type of data. Consequently, researchers have focused on using synthetic data for model development, since simulated images are fast to generate and can, in theory, make up a large and diverse data set. However, when a deep learning model is trained on simulated data and evaluated on real data, there is typically a disparity in performance. This synthetic-to-real gap follows from differences between the synthetic and real domains, for example variations in lighting, textures, perspectives, and environmental conditions, which may not be accurately represented in the simulated training data.
What will be your role?
Bridging the gap between simulated and real data can be done by aligning the distributions of the two domains. For example, the simulated images can be augmented so that object appearance varies more widely, or so that they become more similar to the real data. Recent techniques focus on image-to-image translation using generative AI, including GANs [Duplevska, 2022] and diffusion models [Saharia, 2022]. Other options include the use of large language models (LLMs), Stable Diffusion, or diffusion-based inpainting.
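As a small illustration of the augmentation idea (not part of the assignment itself), the appearance of simulated images can be randomly perturbed so the detector sees a wider distribution at training time. The sketch below uses plain NumPy; the jitter ranges and noise level are arbitrary assumptions, not project parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

def augment(image: np.ndarray) -> np.ndarray:
    """Randomly jitter brightness and contrast and add sensor-like noise,
    widening the appearance distribution of a simulated image."""
    img = image.astype(np.float32)
    contrast = rng.uniform(0.7, 1.3)         # random contrast scaling
    brightness = rng.uniform(-20.0, 20.0)    # random brightness shift
    img = img * contrast + brightness
    img += rng.normal(0.0, 5.0, img.shape)   # additive Gaussian "sensor" noise
    return np.clip(img, 0, 255).astype(np.uint8)

# Stand-in for a rendered simulator frame (random pixels for the demo).
sim = rng.integers(0, 256, (64, 64, 3), dtype=np.uint8)
aug = augment(sim)
print(aug.shape, aug.dtype)
```

In practice one would use a library such as torchvision or Albumentations, which offer richer transforms (perspective warps, blur, weather effects) than this minimal sketch.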
Within this project we previously investigated the effect of simulation variety on the development of a deep learning-based military vehicle detection model [Eker, 2023]. We evaluate on real-world images, and current research focuses on measuring the alignment between simulated and real data, and on joint training (synthetic + real data). The goal of the student assignment is to improve military object detection by using state-of-the-art generative deep learning methods to close the synthetic-to-real gap for simulated images of military vehicles. Depending on the student's interests, we can combine this with a more fundamental question (e.g. measuring similarity with real data) or focus on improving generative methods (e.g. by prompt tuning). In addition, we can look into zero-shot learning with generative methods that do not require any synthetic images as input, such as DALL-E.
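To give a flavour of what "measuring the alignment between simulated and real data" can mean, one simple option is a kernel two-sample statistic such as the maximum mean discrepancy (MMD) computed on image features. The sketch below is a toy NumPy example on synthetic feature vectors; the feature dimensionality, sample sizes, and kernel bandwidth are arbitrary assumptions:

```python
import numpy as np

def mmd_rbf(x: np.ndarray, y: np.ndarray, sigma: float = 4.0) -> float:
    """Squared maximum mean discrepancy with an RBF kernel: a simple
    measure of how well two feature distributions are aligned (0 = identical)."""
    def kernel(a, b):
        # Pairwise squared Euclidean distances, then RBF kernel.
        d = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-d / (2.0 * sigma ** 2))
    return kernel(x, x).mean() + kernel(y, y).mean() - 2.0 * kernel(x, y).mean()

rng = np.random.default_rng(0)
real = rng.normal(0.0, 1.0, (200, 8))      # stand-in "real" image features
sim_far = rng.normal(2.0, 1.0, (200, 8))   # poorly aligned simulated features
sim_near = rng.normal(0.1, 1.0, (200, 8))  # better-aligned simulated features

# A well-aligned simulator should score lower MMD against the real data.
print(mmd_rbf(real, sim_far) > mmd_rbf(real, sim_near))  # expect: True
```

In a real pipeline the feature vectors would come from a pretrained network rather than a Gaussian, similar in spirit to FID-style comparisons between image sets.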
Keywords for this project are: Synthetic-to-real-gap, object detection, image-to-image translation, generative AI, military vehicles
[Duplevska, 2022] Duplevska, D., Ivanovs, M., Arents, J., & Kadikis, R. (2022, September). Sim2Real image translation to improve a synthetic dataset for a bin picking task. In 2022 IEEE 27th International Conference on Emerging Technologies and Factory Automation (ETFA) (pp. 1-7). IEEE.
[Saharia, 2022] Saharia, C., Chan, W., Chang, H., Lee, C., Ho, J., Salimans, T., & Norouzi, M. (2022, July). Palette: Image-to-image diffusion models. In ACM SIGGRAPH 2022 Conference Proceedings (pp. 1-10).
[Eker, 2023] Eker, T. A., Heslinga, F. G., Ballan, L., den Hollander, R. J., & Schutte, K. (2023, October). The effect of simulation variety on a deep learning-based military vehicle detector. In Artificial Intelligence for Security and Defence Applications (Vol. 12742, pp. 183-196). SPIE.
What we expect from you
You are in the final stages of your Master's degree in artificial intelligence, computer science, physics, mathematics, electrical engineering, systems and control engineering, or a similar field, and have a track record in computer vision. Please mention in your cover letter whether you are looking for a graduation project or an internship. Also indicate when you would like to start and what the preferred duration of the internship or graduation project would be.
What you'll get in return
A work placement is the precursor to your career: it gives you an opportunity to take a good look at your prospective future employer. TNO goes a step further. It's not just looking that interests us; you and your knowledge are essential to our innovation. That's why we attach a great deal of value to your personal and professional development. You will, of course, be properly supervised during your work placement and be given the scope to get the best out of yourself. Naturally, we provide suitable work placement compensation.
TNO as an employer
At TNO, we innovate for a healthier, safer and more sustainable life. And for a strong economy. Since 1932, we have been making knowledge and technology available for the common good. We are united by curiosity and ingenuity, and driven to push boundaries. There is ample space and support for your talent and ambition. You work with people who will challenge you: who inspire you and want to learn from you. Our state-of-the-art facilities are there to realize your vision. What you do at TNO matters: impact makes the difference. Because with every innovation you contribute to tomorrow's world. Read more about TNO as an employer.
At TNO we encourage an inclusive work environment, where you can be yourself. Whatever your story and whatever unique qualities you bring to the table. It is by combining our unique strengths and perspectives that we are able to develop innovations that make a real difference in society. Want to know more? Read what steps we are taking in the area of diversity and inclusion.
The selection process
After the first CV selection, the application process will be conducted by the relevant department. TNO will provide a suitable internship agreement. If you have any questions about this vacancy, you can contact the contact person mentioned below.
Students must reside in the Netherlands before the start and also throughout the internship or graduation project at TNO.
Has this job opening sparked your interest?
Then we’d like to hear from you! Please contact us for more information about the job or the selection process. To apply, please upload your CV and covering letter using the ‘apply now’ button.