Natural Language Processing reduces manual text analysis
We are constantly collecting more data, for example from camera images and text documents. This data can provide us with relevant information, but it is not always stored in a structured way, which makes that information hard to retrieve. Natural Language Processing (NLP) is an AI technique that tackles this problem.
What is natural language processing?
NLP combines statistical techniques with machine learning. This makes it possible to extract keywords from a text, which can then be used to classify documents. TNO uses NLP to extract information from large volumes of unstructured textual data in a more automated way.
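A minimal sketch of that combination, assuming scikit-learn as the tooling and an invented two-document corpus (the article does not name TNO's actual pipeline): TF-IDF weights supply the statistical keyword ranking, and the same features feed a simple classifier.

```python
# Keyword extraction plus document classification with TF-IDF.
# The library choice and the tiny corpus are illustrative assumptions only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

docs = [
    "Deep learning improves image recognition in computer vision.",
    "New regulations affect food safety inspections.",
]
labels = ["technology", "agrifood"]

# Statistical step: weight each word by how characteristic it is for a document.
vectorizer = TfidfVectorizer(stop_words="english")
tfidf = vectorizer.fit_transform(docs)

# The highest-weighted terms per document serve as extracted keywords.
terms = vectorizer.get_feature_names_out()
for row in tfidf.toarray():
    top = sorted(zip(terms, row), key=lambda t: t[1], reverse=True)[:3]
    print([term for term, _ in top])

# Machine-learning step: reuse the same features to classify documents.
clf = LogisticRegression().fit(tfidf, labels)
```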
TNO automatically creates taxonomies with natural language processing
Domain terminology can help streamline and standardise processes, for example when captured in a taxonomy or ontology. However, mapping the jargon of a field by hand is a time-consuming exercise.
TNO uses NLP to identify important terms in a set of documents and to determine their mutual relationships. We do this by combining:
- syntactic information (sentence construction)
- keyword extraction
- web sources
- semantic embedding methods
The taxonomy can then be used as input for an expert session.
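A rough sketch of the embedding step, under two assumptions: the candidate terms are already extracted (in practice they would come from the keyword extraction and syntactic analysis above), and the sentence-transformers library with the all-MiniLM-L6-v2 model stands in for whichever embedding method is actually used.

```python
# Sketch: relate candidate taxonomy terms via semantic embeddings.
# Assumptions: terms are already extracted; sentence-transformers is a stand-in
# for the (unspecified) embedding method used in practice.
from itertools import combinations
from sentence_transformers import SentenceTransformer, util

terms = ["machine learning", "deep learning", "computer vision",
         "crop yield", "precision agriculture"]

model = SentenceTransformer("all-MiniLM-L6-v2")
embeddings = model.encode(terms, convert_to_tensor=True)

# Propose links between terms whose embeddings are sufficiently similar;
# a domain expert can then confirm or reject them in an expert session.
for i, j in combinations(range(len(terms)), 2):
    score = float(util.cos_sim(embeddings[i], embeddings[j]))
    if score > 0.5:
        print(f"{terms[i]} <-> {terms[j]} (similarity {score:.2f})")
```

The similarity threshold of 0.5 is arbitrary here; the point is that the tool proposes candidate relationships and the expert session decides which ones belong in the taxonomy.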
Natural language processing is relevant for trend prediction
At TNO, we use our tools to automatically extract information from documents. We can also make predictions, for example in the foresight domain. With the Horizon Scanner, we crawl relevant websites, blogs, and documents and extract information from them. This allows us to retrieve relevant information and to reveal trends.
Trend analysis shows, for instance, that the term deep learning is now mentioned far more frequently within the computer vision domain than it was ten years ago. In addition, we can classify documents automatically, for example by topic or field. We can also use blogs for sentiment analysis, to find out whether terms are being described more positively or more negatively.
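A toy illustration of the trend-counting idea; the documents and dates below are invented for the example, and the Horizon Scanner's actual sources and processing are not shown here.

```python
# Toy trend analysis: how often is a term mentioned per year?
# The documents and their dates are invented purely for illustration.
from collections import Counter

documents = [
    (2014, "Feature engineering dominates computer vision benchmarks."),
    (2014, "Deep learning shows promise on image classification."),
    (2023, "Deep learning is the default approach in computer vision."),
    (2023, "Transformers extend deep learning to new vision tasks."),
]

mentions_per_year = Counter(
    year for year, text in documents if "deep learning" in text.lower()
)
for year in sorted(mentions_per_year):
    print(year, mentions_per_year[year])
```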
Timon Brussaard, Senior Business Development Manager
Timon Brussaard is the Business Director at Leibniz Institute and Senior Business Development Manager at TNO. With a background in economics and change management, Timon has spent the past 18 years working on business development, research, and consultancy processes in the rapidly changing ICT market. His expertise includes norm engineering, information provision, AI, leadership, project management, communication, policy skills, and marketing.
Christopher Brewster, Senior Scientist
Christopher Brewster is a Senior Scientist in the Data Science group and Professor of the Application of Emerging Technologies in the Institute of Data Science, Maastricht University. His research has focussed on the application of Semantic Technologies, Open and Linked Data, interoperability architectures and Data Governance, mostly to the food and agriculture domains.
Daniël Worm, Senior Consultant
Jok Tang, Deputy Research Manager Data Science
Joris Sijs
AI Systems Engineering & Lifecycle Management
The AI system for the future. At TNO, we work on AI systems that remain reliable and can handle new functions in the future.
You can read about how AI is educated in Chapter 1. How can we make clear to AI which goals we want to pursue as humans? And how can we ensure intelligent systems will always function in the service of society?
Innovation with AI
What does that world look like in concrete terms? Using numerous examples, TNO sketches a prognosis for the future in Chapter 2. Take construction, for example, where AI will be used to check the quality, safety, and energy efficiency of buildings before they are actually built. Or healthcare, where robots will take over some of caregivers' tasks and AI will be able to develop medicines autonomously.
Innovating innovation with AI
How AI will change research itself is explained in Chapter 3. For example, what role will AI be permitted to play in knowledge sharing? And what will happen when we let machines work with unmanageably large data sets?
David Deutsch on the development and application of AI
Peter Werkhoven, chief scientific officer at TNO, joins physicist, Oxford professor, and quantum computing pioneer David Deutsch for a virtual discussion. Deutsch set out his vision in 1997 in his book The Fabric of Reality. Together, they talk about the significance of quantum computing for the development and application of AI. Will AI ever be able to generate 'explanatory knowledge' or learn about ethics from humans?