How Google’s New Weather AI Will Make Sure You Never Get Caught in the Rain


Among the many things we’ve become addicted to on our smartphones is checking the weather. If you’re anything like me, you open a weather app at least twice a day: in the morning to know what to expect for the day ahead, maybe before your commute home so you can prepare for possible rain or snow, and sometimes before bed to get an idea of what to wear or what activities to plan for the next day. Depending on where you live, how much time you spend outside, and how prone your area is to rapid weather changes, you may check the forecast even more frequently than that.

The fact that our phones now contain hour-by-hour breakdowns of temperature and likelihood of precipitation means we can be well informed and well prepared. But these forecasts come at a greater computational cost than most of us realize, and they’re not always right.

On Monday, a post on Google’s AI blog shared a machine learning method the company is developing for weather prediction. Google calls the technique “nowcasting” because it is designed to predict weather zero to six hours ahead, and it focuses on weather events like thunderstorms, where conditions can quickly shift from clear skies to heavy rain to gusting wind and back again.

By simplifying its methodology and actually using less data than existing forecasting techniques, the company believes it can give us accurate, timely weather predictions, especially ones relating to precipitation.

One of the biggest problems with existing forecasting is the sheer amount of data it incorporates, and the computing power needed to process and make sense of it all. The US National Oceanic and Atmospheric Administration (NOAA) alone collects 100 terabytes of data per day. That data is fed into forecasting models that simulate everything from atmospheric dynamics and thermal radiation to lake and ocean effects.

The forecasting engines run on supercomputers, but even those are constrained by the heavy computational demands. According to Google’s blog post, those demands limit weather prediction to a spatial resolution of about 5 kilometers (3.1 miles), and the simulations take several hours to run. If, for example, it takes six hours to compute a forecast, that allows only three to four runs per day, and each forecast is based on data that’s six or more hours old.

Google presents its method as being far simpler, describing it as highly localized, data-driven, physics-free, and low-latency. Essentially, the method turns weather forecasting into a computer vision problem; based on progressive images of the formation and movement of clouds over a short time period, a machine learning algorithm predicts how the pattern will evolve over the subsequent few hours.

Specifically, Google uses a convolutional neural network (CNN), a type of deep learning algorithm whose architecture is particularly well suited to image analysis. The “physics-free” descriptor means the neural network learns only from its training data and doesn’t incorporate knowledge of how the atmosphere works; all it has to go on are the patterns it identifies in the images it’s fed.
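To make the computer-vision framing concrete, here is a minimal sketch in plain NumPy. It stacks a few past radar frames as input channels and applies a single hand-written 2D convolution to produce a guess at the next frame. This is an illustration of the general idea only: Google’s actual model is a far deeper trained CNN, and the function names, frame sizes, and averaging kernels below are assumptions invented for this example.

```python
import numpy as np

def make_training_pairs(frames, history=3):
    """Turn a sequence of radar frames into (input stack, target) pairs.

    frames: array of shape (T, H, W), one radar snapshot per time step.
    Returns inputs of shape (N, history, H, W) and targets of shape (N, H, W).
    """
    inputs, targets = [], []
    for t in range(history, len(frames)):
        inputs.append(frames[t - history:t])   # the last `history` frames
        targets.append(frames[t])              # the frame to predict
    return np.stack(inputs), np.stack(targets)

def conv2d_same(image, kernel):
    """Naive 2D convolution with zero padding ('same' output size)."""
    kh, kw = kernel.shape
    pad_h, pad_w = kh // 2, kw // 2
    padded = np.pad(image, ((pad_h, pad_h), (pad_w, pad_w)))
    out = np.zeros_like(image, dtype=float)
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * kernel)
    return out

def predict_next_frame(stack, kernels):
    """One 'physics-free' convolutional step: each input frame is convolved
    with its own kernel and the results are summed, mimicking a single CNN
    layer with `history` input channels and one output channel."""
    return sum(conv2d_same(frame, k) for frame, k in zip(stack, kernels))

# Toy data: 10 random 8x8 "radar" frames standing in for real imagery.
rng = np.random.default_rng(0)
frames = rng.random((10, 8, 8))
X, y = make_training_pairs(frames, history=3)

# 3x3 averaging kernels stand in for weights a real CNN would learn.
kernels = [np.full((3, 3), 1 / 27) for _ in range(3)]
prediction = predict_next_frame(X[0], kernels)
```

A trained model would learn the kernel values (and many layers of them) by minimizing the error between predictions and the frames that actually followed; the data pipeline, however, looks much like `make_training_pairs` above.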

Google points out that from a computing power perspective, this method is much more economical than existing forecast techniques, especially once the model is already trained. The company claims its algorithm can generate forecasts that have a one-kilometer resolution with a latency of five to ten minutes.

When compared to three common forecasting models—NOAA’s High-Resolution Rapid Refresh (HRRR) numerical forecast, an optical flow algorithm, and a persistence model—Google claims its method outperformed all three on precision for prediction horizons shorter than five to six hours, emphasizing that its low latency makes predictions “effectively instantaneous.”
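Of those three baselines, the persistence model is the simplest: it assumes the weather in an hour will look exactly like the weather now. Here is a hedged sketch of that baseline and a precision score over a binary rain/no-rain threshold; the threshold value and the toy data are made up for illustration and are not Google’s evaluation setup.

```python
import numpy as np

def persistence_forecast(current_frame, steps):
    """Persistence baseline: every future frame is predicted to equal
    the most recent observation, unchanged."""
    return np.repeat(current_frame[np.newaxis], steps, axis=0)

def precision(predicted, observed, rain_threshold=0.5):
    """Precision of binary rain predictions: of all cells where rain was
    predicted, what fraction actually saw rain?"""
    pred_rain = predicted >= rain_threshold
    obs_rain = observed >= rain_threshold
    predicted_positives = pred_rain.sum()
    if predicted_positives == 0:
        return float("nan")
    return (pred_rain & obs_rain).sum() / predicted_positives

# Toy data: a current radar frame plus the 4 frames that "actually" followed,
# simulated as the current frame plus small random drift.
rng = np.random.default_rng(1)
now = rng.random((8, 8))
future_truth = np.clip(now + rng.normal(0, 0.2, (4, 8, 8)), 0, 1)

forecast = persistence_forecast(now, steps=4)
score = precision(forecast, future_truth)
```

Persistence is a surprisingly strong baseline for the first few minutes—weather rarely changes instantly—but it degrades quickly over longer horizons, which is exactly the regime where Google says its CNN keeps its edge.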

So instead of checking the weather on our phones a couple times a day or every few hours, it may not be long before we can access accurate forecasts every minute.

We could also just look out the window—or put our phones down and go for a walk outside.

Image Credit: Todd Diemer, CWEB


This article originally appeared on Singularity Hub, a publication of Singularity University.












