At Element 84, our team looks forward to Spring each year and the return of a busy conference schedule. Our time attending, presenting, and learning at conferences helps us stay connected to trends in the geospatial industry and beyond. It also allows us to receive feedback on our team’s latest projects. Over the past few months we’ve made an effort to connect with a variety of attendees during our conference cycle to learn more about what data users and providers are lacking in their work. In other words: where are the biggest gaps in Earth Observation (EO) data, and in the EO field more generally?
For each conference that we sent E84-ers to this season, we tasked them with asking the same question: “In your opinion, what are the biggest gaps you encounter working with EO data?” As would likely happen with any open-ended question, we received a range of answers that sometimes presented conflicting perspectives. After collecting a sample of responses, we synthesized the resulting themes to distill tangible solutions for the issues we heard repeated back to us from users and providers alike.
Common themes according to data providers and data consumers
This exercise produced several themes that pointed to a general misalignment between data providers and data consumers. Mainly, respondents described discrepancies between the two groups in their perceptions of data availability and in their procurement structures.
Data availability
The discussion around data availability is not a new one in this space. At ClimateTech Connect this Spring, our Climate Director, Catherine Oldershaw, had several conversations that highlighted a paradox: data consumers felt both that they were “drowning in data” and that there was a “frontier” of emerging data that could be useful to them. This points to the fact that managing that data is currently cumbersome, and may become even more so if data providers aren’t thoughtful about integrating it seamlessly into consumers’ workflows. New data was desired particularly around ‘newer’ climate hazards like heat and hail. Data resilience is desired too, since much federally funded public climate data could become inaccessible. The challenge, then, and a core theme that recurred, was how to make the data “decision-useful” and “at-the-ready.”
Data procurement structures
Similarly, several of our respondents from the latest STAPI Sprint in Lisbon noted a misalignment in procurement structures between data providers and their clients. US federal government agencies, collectively among the largest buyers of EO data, often prefer to sign a lump-sum contract and then draw down on the funds as data is needed at any given time. In many cases, however, this is not compatible with the commercial model of many private EO companies. The issue is not the cost but the contracting mechanism itself.
Gaps in EO specific to data users
Many of the data users that we spoke to brought up industry-specific concerns. This highlights the need for case-oriented solutions, which we discuss in more depth in the next section. That said, the two most frequently cited data user concerns were cost and latency.
Cost
For the insurance and risk industry specifically, there was discussion around cost as a barrier to entry. Part of the issue is that these firms need high-resolution data with frequent revisits for the data to add value to their models and analysis. Additionally, many firms in the insurance and risk industry do not have the in-house skillset to glean insights from EO data (especially lower-resolution data). In these situations, cheaper (or free) lower-resolution data is not an option, and would-be users are forced to make do without.
Latency
This Spring we also connected with a variety of folks from the disaster response field, and we heard repeated frustrations about EO data providing (or not providing) needed information in real time for both disaster responders and survivors. These gaps concern the timeliness of data access (latency), image resolution, and environmental conditions. Given the sheer number of disasters that happen each year, many of which exceed $1 billion in damages, individuals involved in a disaster scenario cannot wait for optimal conditions: for data to be helpful, it needs to be immediately available and fully comprehensive.
Our take: the intersection of these concerns
Although our analysis is by no means a comprehensive cross section of the entire industry, this exercise has consistently revealed how crucial use cases are when it comes to problem solving in this space. As Aravind Ravichandran summarized after his successful EO Summit conference in New York, “Too much of the EO ecosystem still builds for users, not with them. The result: technically sophisticated tools that miss real-world workflows, decision cycles, and data literacy levels.” An individual’s largest perceived gap in EO depends entirely on whom you ask. Even if one person gives you an in-depth response detailing an existential problem in their specific work, another stakeholder might find that same issue completely irrelevant.
This is true essentially across the board when we separate responses we received by theme:
- The cost of imagery is often not a concern for the government, but can render EO data completely inaccessible to virtually the entire insurance industry.
- Lower-resolution imagery works just fine for large-scale problems, such as identifying deforestation or burn scars, but is entirely useless for tasks like locating individual people.
- High latency in data delivery is insurmountable for disaster response, but perfectly serviceable for longer-term research.
- Variability in data delivery formats can range from an unsolvable problem to a minor annoyance… depending on who you ask.
How to close the gap and meet user needs
Our entire purpose at Element 84 hinges on solving the world’s biggest problems, which can only happen when we truly meet user needs. This conference season exercise has reemphasized the importance of starting with the specific use case and working backward from there to solve the problem. In other words, the most impressive technology in the world doesn’t mean much if it can’t solve the real-world problems it was designed to solve. Everything must be created with the user and the end use in mind. Although it is often tempting to start with the data-side problems, the mini testimonials we received this season demonstrate that this approach risks losing sight of the use case entirely.
If you’re interested in discussing this idea further, send us a message! We’d love to hear your perspective on gaps in EO data or the EO space more generally, especially as they relate to meeting user needs more directly.