How AI, smartphones and satellites together help us better understand global ecosystems
Freiburg, 04/02/2026
An international research team headed by the University of Freiburg demonstrates the potential of combining citizen science, AI and satellite data. The researchers took millions of plant observations made by people worldwide and combined them with satellite, climate and soil data. This enabled them to train AI models that recognise plant characteristics directly from photographs, and to create the most precise global maps of the plant world yet.

How high plants grow, how large their leaves are and how much nitrogen they contain all influence key ecosystem processes such as the storage of carbon or vegetation’s resilience in the face of climate change. Plant characteristics such as these, which researchers describe as ‘functional traits’, have until now usually been measured directly on-site at specific locations. Two new international studies headed by a research unit at the University of Freiburg’s Faculty of Environment and Natural Resources (UNR) have now shown how these gaps in our knowledge can be filled using citizen science data. Both studies use observations and smartphone photographs made by members of the public, but each takes a different approach in other regards.
“Citizen science opens up new opportunities for us to record plant characteristics worldwide and comparably,” says Prof. Dr. Teja Kattenborn, Professor of Sensor-based Geoinformatics at the University of Freiburg and initiator and co-author of both studies. “When we systematically combine crowdsourced observations with environmental and remote sensing data and analyse them using Artificial Intelligence methods, we can uncover ecological patterns that could hardly be measured with classic field studies alone. This is essential for better understanding terrestrial ecosystems worldwide and modelling them more reliably.”
New method yields high-resolution maps of the plant world
For the study ‘Crowdsourced biodiversity monitoring fills gaps in global trait mapping’, published in the journal Nature Communications, the researchers combined millions of species observations from citizen science platforms such as iNaturalist with professional photographs of vegetation. In addition, they drew on plant trait measurements from international databases and on high-resolution Earth observation data on climate, soil and vegetation structure.
Using machine learning, this big-data approach makes it possible to deduce how properties of plant communities, such as leaf size, nutrient content or growth height, are distributed worldwide. The combined data yielded far more reliable, more extensive and more spatially detailed maps than previous methods.
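The idea of predicting a community-level trait from environmental covariates can be illustrated with a minimal sketch. This is not the authors’ published pipeline: the covariates, trait relationship and data here are entirely synthetic, and a random-forest regressor stands in for whichever models the study actually used.

```python
# Minimal, hypothetical sketch: predicting a plant trait (e.g. growth
# height) from gridded environmental covariates with a random forest.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-ins for climate and soil covariates at 2,000 sites
n = 2000
X = np.column_stack([
    rng.uniform(-5, 30, n),     # mean annual temperature (degrees C)
    rng.uniform(100, 3000, n),  # annual precipitation (mm)
    rng.uniform(4.0, 8.5, n),   # soil pH
])
# Invented trait signal: height increases with warmth and rainfall
y = 0.2 * X[:, 0] + 0.003 * X[:, 1] + rng.normal(0, 0.5, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# Evaluate on held-out sites; in a real study, the fitted model would
# then be applied to global covariate grids to produce trait maps.
r2 = model.score(X_test, y_test)
print(f"held-out R^2: {r2:.2f}")
```

The key design point mirrored here is that the model is trained where trait observations exist and then applied to covariates available everywhere, which is what turns scattered point data into wall-to-wall maps.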
“We were able in this way to create the first global maps for most plant characteristics,” explains Daniel Lusk, first author of the study and research associate at the Chair of Sensor-based Geoinformatics at the University of Freiburg. “Our approach offers marked improvements, especially in regions where until now there have been hardly any measurements available. We’ve therefore created an essential basis for the further development of global vegetation and climate models.”
The maps can be viewed online on the Global Trait Maps platform.
AI analyses crowdsourced smartphone photographs
The study ‘PlantTraitNet: An Uncertainty-Aware Multimodal Framework for Global-Scale Plant Trait Inference from Citizen Science Data’ takes a complementary, more strongly AI-based approach. The researchers developed a process that derives functional plant characteristics directly from smartphone photographs that people worldwide have taken and uploaded to platforms such as iNaturalist or Pl@ntNet. An AI program recognises patterns in the images and from these estimates features such as plant height, leaf area, specific leaf area and nitrogen content. In addition, the AI automatically retrieves climate information based on the GPS coordinates embedded in the photographs. The individual predictions from thousands of photographs are then compiled spatially to produce global maps of key plant characteristics. The AI system also takes image quality and other uncertainty factors into account in order to weight individual observations appropriately.
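The final two steps described above, compiling per-photo predictions spatially and weighting them by uncertainty, can be sketched in a few lines. This is an assumed illustration, not the published PlantTraitNet code: the grid size, the inverse-variance weighting scheme and all values are invented for demonstration.

```python
# Hypothetical sketch: aggregating per-photo trait predictions into a
# coarse global grid, down-weighting uncertain observations.
import numpy as np

rng = np.random.default_rng(1)

# Invented per-photo records: location, predicted trait value (plant
# height in m) and the model's per-prediction uncertainty (std. dev.)
n = 10_000
lon = rng.uniform(-180, 180, n)
lat = rng.uniform(-60, 80, n)
pred = rng.uniform(0.1, 30.0, n)
sigma = rng.uniform(0.2, 5.0, n)

# Assign each photo to a 2-degree grid cell
col = np.floor((lon + 180) / 2).astype(int)
row = np.floor((lat + 90) / 2).astype(int)
cell_id = col * 1000 + row

# Inverse-variance weighted mean per cell: confident predictions
# (small sigma) count more than uncertain ones
w = 1.0 / sigma**2
ids, idx = np.unique(cell_id, return_inverse=True)
num = np.bincount(idx, weights=w * pred)
den = np.bincount(idx, weights=w)
cell_mean = num / den

print(f"{len(ids)} occupied grid cells")
```

Inverse-variance weighting is one standard way to realise “weight the individual observations appropriately”; the actual system may use a different scheme.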
“Our results show that ecological information can actually be recognised in everyday photographs,” says Ayushi Sharma, first author of the study and research assistant at the University of Freiburg’s Chair of Sensor-based Geoinformatics. “When a lot of people photograph plants, this produces a common data set that enables new insights into global vegetation.”
The study has been recognised internationally for its approach: at the 40th Annual AAAI Conference on Artificial Intelligence (AAAI-26), it won the Best Paper award in the Social Impact category.
Further Information
Original publication: Lusk, D. et al. (2026) Crowdsourced biodiversity monitoring fills gaps in global trait mapping. Nature Communications. DOI: 10.1038/s41467-026-68996-y
Original publication: Sharma, A. et al. (2026) PlantTraitNet: An Uncertainty-Aware Multimodal Framework for Global-Scale Plant Trait Inference from Citizen Science Data. AAAI 2026. DOI: 10.48550/arXiv.2511.06943
Daniel Lusk is a research associate at the Chair of Sensor-based Geoinformatics (geosense) at the University of Freiburg. Among other things he studies how global species diversity and plant characteristics can be better analysed using citizen science, remote sensing data and machine learning.
Ayushi Sharma is a research assistant at the Chair of Sensor-based Geoinformatics (geosense) at the University of Freiburg. She studies the use of image data and Artificial Intelligence to record and understand species diversity better.
Prof. Dr. Teja Kattenborn is the Professor of Sensor-based Geoinformatics (geosense) at the University of Freiburg’s UNR and principal investigator for the Cluster of Excellence Future Forests.