Software Supporting Science.
TechTraverse is…
A small team of dedicated software engineers with outsized expertise in managing and providing access to Earth Science data via cloud-native architectures.
Our close relationship with the National Oceanic and Atmospheric Administration (NOAA) gives us important insight and opportunities, and our dedication to open development and collaboration is critical for enabling our users in the scientific community.
Our work…
A New Way of Detecting Wildfires
Background: Wildfires are among the most destructive natural disasters, causing significant damage to ecosystems, human life, property, and air quality. As climate change intensifies, leading to rising temperatures, declining snowpack, and more frequent droughts, the severity of wildfires has escalated globally, making early detection and accurate forecasting increasingly essential.
In response to this growing threat, NOAA is collaborating with the University of Wisconsin Space Science and Engineering Center (SSEC) to develop the Next Generation Fire System (NGFS).
Our Job: Build a responsive, performant web portal providing rapid access to critical space- and ground-based wildfire observations, models, warnings, and forecasts, relevant to diverse user communities including scientific researchers, weather forecasters, emergency responders, educators, policy makers, citizen scientists, and the general public.
Using scalable, distributed, cloud-based technologies, and embracing FAIR, CARE, and DEIA data principles, we are building this fire detection and data access system. By leveraging the newest generation of API specifications from the Open Geospatial Consortium (OGC) as well as distributed, cloud-optimized data formats such as GeoParquet and Cloud Optimized GeoTIFF (COG), we are also laying the foundation for reusable data access services to power a variety of NOAA’s future web-based user interfaces as well as GIS clients and user-developed software.
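To make that data-access pattern concrete, here is a minimal sketch assuming a hypothetical OGC API - Features endpoint and a hypothetical GeoParquet URL (neither is a real NOAA service). It shows how the same detections could be served through a standards-based API or read directly from cloud-optimized storage.

```python
import requests
import geopandas as gpd

# Hypothetical OGC API - Features endpoint, for illustration only.
OGC_API = "https://example.noaa.gov/ogcapi"

def fetch_fire_detections(bbox, start, end, limit=1000):
    """Query fire detections from an OGC API - Features collection as GeoJSON features."""
    resp = requests.get(
        f"{OGC_API}/collections/fire-detections/items",
        params={
            "bbox": ",".join(map(str, bbox)),  # minx,miny,maxx,maxy in WGS84
            "datetime": f"{start}/{end}",      # ISO 8601 interval
            "limit": limit,
            "f": "json",
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["features"]

def load_detections_geoparquet(url):
    """Read the same detections from a (hypothetical) GeoParquet file in object storage."""
    # geopandas reads GeoParquet directly; fsspec-style URLs reach cloud object stores.
    return gpd.read_parquet(url)
```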
Creating an Open Science Knowledge Mesh
Background: The size and scope of scientific data are increasing rapidly. Of course, the number of bytes being recorded and stored continues to grow exponentially, but perhaps even more challenging is handling the ever-increasing diversity of those data and the relationships between them.
Enabling the long-term preservation of these data, capturing the inputs and outputs of the many processes run against them, and exposing them to the scientific community with full dimensionality, attribution, and provenance are all required to fully leverage them. This is the goal of the Open Information Stewardship System, being developed by NOAA’s National Centers for Environmental Information in collaboration with TechTraverse.
Our Job: Implement a data model and corresponding managed workflow system that is flexible enough to accommodate any process on any kind of data while fully capturing the history of those processes and data as they flow through it. Expose the resulting data, and just as critically their contextualized, linked metadata, to the public, enabling not only data discovery but also full attribution of their sources, provenance describing how they were produced or transformed, and the semantic relationships they have with other data sets.
No small task, to be sure. But one which is worth undertaking to enable the creation of new knowledge from such a vast and diverse corpus of scientific data.
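As an illustration of the kind of linked metadata involved, here is a minimal sketch of a provenance-carrying record using W3C PROV-style terms. The identifiers, field names, and structure are hypothetical and are not the actual Open Information Stewardship System data model.

```python
import json

# Illustrative only: a PROV-inspired shape showing how a derived dataset can link
# back to its inputs, the process that produced it, and related data sets.
record = {
    "id": "dataset:sst-anomaly-2024-07",
    "type": "Entity",
    "wasGeneratedBy": {
        "id": "activity:regrid-and-average-0042",
        "type": "Activity",
        "used": ["dataset:sst-daily-obs-2024-07"],     # inputs to the process
        "wasAssociatedWith": "agent:example-archive",  # who ran it (attribution)
        "startedAtTime": "2024-08-01T00:00:00Z",
    },
    "wasDerivedFrom": ["dataset:sst-daily-obs-2024-07"],  # provenance link
    "related": [
        # semantic relationship to another data set
        {"rel": "seeAlso", "target": "dataset:marine-heatwave-index-2024"},
    ],
}

print(json.dumps(record, indent=2))
```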
Leveraging the Knowledge Mesh With AI
Background: If an Open Science Knowledge Mesh exists… what can we do with it? How do we leverage the data sets themselves and the knowledge graph connecting them to make new discoveries and produce more knowledge? Exploring the many answers to these questions is the goal of NOAA’s Broad Agency Announcement (BAA) entitled “A Study to Determine Natural Language Processing (NLP) Capabilities with the NCCF Open Knowledge Mesh (KM)”.
Our Job: In collaboration with Element 84, TechTraverse will participate in the BAA by performing a study aimed at leveraging the Open Knowledge Mesh with a suite of modern AI techniques, enabling new ways of discovering, querying, analyzing, and deriving knowledge from data in the mesh.
Just about everything’s on the table here: LLMs, RAG, agents… any of the emerging artificial intelligence tools and techniques for interacting with users and producing useful responses, with the critical requirement that those responses be grounded in fact. In this domain, generating text that merely looks like other text in the training data does not suffice: users such as scientists, policy makers, and educators need to know that their results are based on observed or derived data, and where those data came from. The knowledge mesh gives us a way to trace that information, and our job is to make it easily available to users.
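As a sketch of what “grounded in the mesh” could look like in practice, here is a minimal retrieval-augmented generation outline. MeshRecord, the toy keyword retrieval, and the omitted LLM call are hypothetical stand-ins, not the study’s actual design; the point is the shape: every answer carries the records it was derived from.

```python
from dataclasses import dataclass

@dataclass
class MeshRecord:
    id: str          # identifier of a dataset or metadata node in the mesh
    text: str        # text rendering of the record (summary, abstract, key values)
    provenance: str  # where the underlying data came from

def retrieve(query: str, records: list[MeshRecord], k: int = 3) -> list[MeshRecord]:
    """Toy keyword retrieval; a real system would use vector or graph search over the mesh."""
    words = query.lower().split()
    return sorted(records, key=lambda r: -sum(w in r.text.lower() for w in words))[:k]

def answer_with_sources(query: str, records: list[MeshRecord]) -> dict:
    """Assemble a prompt from retrieved records and return it alongside its sources."""
    hits = retrieve(query, records)
    context = "\n".join(f"[{h.id}] {h.text}" for h in hits)
    prompt = (
        "Answer using only the sources below and cite their ids.\n"
        f"{context}\n\nQuestion: {query}"
    )
    # An LLM call would go here; returning the sources lets every response be
    # traced back to specific records (and their provenance) in the mesh.
    return {"prompt": prompt, "sources": [{"id": h.id, "provenance": h.provenance} for h in hits]}
```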
We’d love to hear from you!
Reach out via email: hello@techtraverse.io