Inside Sofar Ocean Technologies' Epic Quest to Open Ocean Data
- “We’re building standardization to enable a rapid expansion of sensing in the ocean. (Traditionally) every sensor, every piece of hardware is different, with a different protocol, and a different connector. If you try to put them together, things get ugly quick, and everything turns into an engineering project. Engineering projects are great for engineers, (but) they’re bad for almost everybody else (because) they cost a lot of time and money.” Tim Janssen, CEO, Sofar Ocean Technologies
- Image courtesy SOFAR
Tim Janssen, CEO, Sofar Ocean Technologies, discusses this real-time ocean intelligence platform’s quest to collect, network and distribute vast amounts of ocean information and insight, essentially creating the ‘nervous system for the oceans.’
- Tim, to start, please give us a by-the-numbers look at your company.
We are a startup, with 50 people in our San Francisco office, and 50 on the water. We have more than 1,200 live sensors right now, one of the largest networks ever created for ocean sensing. We have more than 5 million ocean hours on our platform and we collect more than 100,000 data points daily. That data allows us to improve weather forecasts over the oceans; for our wave forecasts in particular, we see 30 to 50% improvements for every operational cycle. We also deliver insights for shipping, such as route optimization. We’re trying to use sensor information to create insights that were previously not available.
We talk about getting information from the ocean and turning it into actionable data on a daily basis. What is unique about the Sofar approach?
“Ocean intelligence” is new, and as a result, it has to be inherently full stack. Data in itself is relevant for experts, but not that relevant for the everyday user. You need to be an expert to understand how that (data) changes forecasts. We create the insights from our sensors, and our sensor network is basically the central nervous system of our oceans. What we do is drive (data) into forecast models to translate it into better (weather and forecast) insights, (helping to) reduce the uncertainty around, for example, what waves and winds are going to be doing over the ocean. Specifically for maritime shipping, we deliver better options for routing vessels for safety, efficiency, and reduced emissions.
- I know you’ve already given me a couple of key statistics, but I’d like you to put a little more meat on the bones.
First off, I think an important difference between what we do and what has been done in our oceans is that we are switching to a distributed paradigm. This is fundamentally not that different from what has happened in space over the last two decades, where you see a shift from single exquisite platforms that are incredibly expensive and mostly government-owned, to networks of lower cost nodes. Together, all of those lower cost nodes can provide much more information, much more synoptic insight into what’s actually going on. Basically, we are taking that same idea and bringing it to the ocean, where we’ve mostly been pretty bad at doing that; stuck building large platforms that require large operational support and need a PhD in oceanography to operate the instrument. As a result, (that type of network) does not scale.
- Why is scale important?
Scale is fundamental to what we do. Everything has to be global scale, thousands of sensors, lots of data. The platform that we use is the Spotter platform, which is a solar powered, satellite connected, metocean buoy. In addition, and this is really critical, we’re building standardization to enable a rapid expansion of sensing in the ocean. (Traditionally) every sensor, every piece of hardware is different, with a different protocol, and a different connector. If you try to put them together, things get ugly quick, and everything turns into an engineering project. Engineering projects are great for engineers, (but) they’re bad for almost everybody else (because) they cost a lot of time and money. What we want to do is create large, heterogeneous networks of sensors that cover much more than we’ve ever been able to do before. Creating a standardization discipline is not just for us; this is going to be completely open to the community to enable faster innovation on the sensor level and integration into platforms like ours to create broader capability. What it does immediately for us is that it enables us to integrate subsurface sensors, so our Spotter platform can now also measure water levels, salinity, temperature, etc., and any other subsurface sensor that could benefit from having a real-time connectivity platform associated with it. So fundamentally in terms of cost, what’s really important is that the cost of the nodes has to become incredibly small.
To be successful and at scale, thousands, tens of thousands of sensors, complete planetary coverage, we have to bring the cost down of the individual nodes radically.
That’s basically what we’re working on in order to drive the scale. (So today we’re) focusing on scale first, being disciplined around the hardware that we’re building and trying to enable the community to drive the innovation needed to grow faster.
- What do you consider to be the biggest challenge to keeping this network functional and growing?
I think the most important thing about maintaining something like this is to show its value. Fundamentally, what we’re focusing on is creating value out of this data, creating insights that haven’t been available before. So there are two parts to this. One is broad application: driving innovation and standardization across the community, because with that, everything we build becomes more valuable. For example, communication standardization designed specifically for marine connectivity means the integration of components into platforms, (such as) our Spotter platform, helps the entire industry innovate faster and more efficiently.
The other part of it is focused insight. (An example of this is) Wayfinder, our maritime shipping route optimization capability, driven and powered by the fact that we have all this additional and unique information across the ocean.
- When you look at the market today, which technologies do you believe could be the most transformational in helping gather and deliver better ocean information faster and cheaper?
‘Cheap’ is a means to an end; it’s really scale that matters. The capabilities that we have developed to date were enabled by advances across multiple technologies. Consider the challenges you have in collecting ocean data: one, it’s the ocean, with salt water and storms, and anything electronic generally doesn’t like that very much. (But to answer your question): communications have traditionally been very expensive through monopolies in satellite communications, and that’s changing rapidly. Power: advances in battery technology and solar capture have made it possible for us to build a low cost, completely autonomous system that can stay out in the ocean forever and provide useful information. More generally, advances in IoT technology give us the ability to build something extremely low cost that is as capable as your home computer was maybe 20 years ago.
Other things, like material sciences, are also obviously important. But the point is that it’s not a single innovation that makes it possible to do what we do today, and I don’t expect a single innovation to fundamentally change what we’re going to be doing in the future. I think one of the things that has been missing is standardization. Nobody else was doing it, (so) we figured we have to do that. I see (radical standardization) as the single most important advancement in the industry.
- Can you give us some detailed insights on organizations that are using your services today?
Basically we have three things that we sell and provide to customers. One is hardware, to enable folks to collect data from the ocean, democratizing access to ocean data. One example is Aqualink, a philanthropy that’s focused on monitoring in real time the health of coral reefs, and particularly looking at reef bleaching induced by heat waves in the ocean.
The second part of what we deliver is large amounts of unique data. That’s a very different type of customer, and you can think about large government agencies that run their own forecasting systems or intelligence agencies that would like to have unique information around what’s going on in the ocean right now.
And finally, insights. The shipping industry is our last customer layer, and an example there is Berge Bulk which has been using our system to improve the efficiency of their largest vessels shuttling between Brazil and China.
- What’s next for Sofar Ocean Technologies?
A lot, as we have just scratched the surface. The whole space, the maritime space, the ocean space, is very young, and we have a long way to go to take this to the next level. And we have to, because the pressure is on, as it is pretty clear to everyone that ocean dynamics is critical for understanding global weather and climate.
We’ve just closed our series B financing, which for a start-up is a big deal. Our current round, led by Foundry Group and Union Square Ventures, is an important step, but the excitement (for this type of environmental and ocean intelligence) in general in the investor community is massive.
We have got to get better and faster at what we’re doing. We’ve learned a ton, but we’ve also learned everything we shouldn’t do. (We have to) get better at figuring out better ways to deliver real value from ocean intelligence to our customers. Finally, the biggest hurdle to get to large distributed heterogeneous networks of sensors around the world is standardization. And this is not just impacting us; this is impacting every startup in the space, whether you build an autonomous surface vehicle, a glider, or another sensor platform. Aggressive standardization will help the entire community advance to the next stage.
* This interview was edited for clarity and brevity. Watch the full interview with Tim on Marine Technology TV.
Who: Sofar Ocean Technologies is a real-time ocean intelligence platform based in San Francisco, California.
What: Sofar has a global distributed network of thousands of sensors designed to capture data to drive unique weather forecasts and enable dynamic vessel routes. Cumulatively, these sensors have already racked up over five million ocean hours. Where Sofar seeks to differentiate itself is in its interconnected and global network and the vast amounts of data collected, leveraging scale and accessibility. Sofar devices are designed to be ‘cost-effective’ and scalable depending on customer needs. From the Southern Ocean to the kelp forests of California and the coral reefs of Australia, Sofar is delivering more than 100,000 unique ocean data points daily.
Why: While the oceans cover over 70% of the Earth’s surface, we currently have an underwhelming amount of data about them, even though 3.5% of the world’s GDP, which is close to $3 trillion, happens on the ocean’s surface. To better predict the future state of weather and climate over our oceans, data density is critical.
- June 2021: Sofar opened access to its proprietary data platform to all scientists.
- September 2021: Sofar introduced a new marine hardware standard, Bristlemouth, aimed at catalyzing more collaboration, research and innovation for big data from the oceans with partners DARPA, the Office of Naval Research & Oceankind.
- November 2021: Sofar announced a $39 million Series B syndicate round with Union Square Ventures and the Foundry Group.