Time series analysis is one of the most common operations in remote sensing. It helps in understanding and modeling seasonal patterns, as well as in monitoring land cover changes. Earth Engine is uniquely suited to the extraction of dense time series over long periods of time.
In this post, I will go through different methods and approaches for time series extraction. While there are plenty of examples available that show how to extract a time series for a single location, unique challenges come up when you need a time series for many locations spanning a large area. I will explain those challenges and present code samples to solve them.
The ultimate goal for this exercise is to extract NDVI time series from Sentinel-2 data over 1 year for 100 farm locations spanning an entire state in India.
Prefer a video-based guide?
Check out this series of videos, which gives step-by-step instructions and explains the process of extracting an NDVI time series in Earth Engine using MODIS data.
Preparing the data
FARM LOCATIONS
To make the analysis relevant, we need a dataset of farm locations spanning a large area. If you have your own data, you can upload the shapefile to Earth Engine and use it. For the purpose of this post, I generated 100 random points, with the condition that they fall on farmland growing a particular crop. We can use the GFSAD1000: Cropland Extent 1km Crop Dominance, Global Food-Support Analysis Data dataset from the Earth Engine Data Catalog to select all pixels that are farmland growing wheat and rice. The stratifiedSample() method then allows us to generate sample points within those pixels, approximating a dataset of 100 farm locations.
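A sketch of this step is below. The dataset id, band name, and class values follow the GFSAD1000 catalog entry but should be verified; the rectangle standing in for the state boundary and the 'id' property are my own assumptions for illustration.

```javascript
// Select wheat/rice-dominant cropland pixels from GFSAD1000.
var gfsad = ee.Image('USGS/GFSAD1000_V1').select('landcover');

// Classes 1 and 2 are the wheat/rice-dominant cropland classes
// (check the catalog legend before relying on these values).
var cropland = gfsad.updateMask(gfsad.eq(1).or(gfsad.eq(2)));

// Placeholder rectangle standing in for the state boundary.
var geometry = ee.Geometry.Rectangle([74.5, 15.5, 78.5, 19.5]);

// Sample ~50 points per class, giving ~100 points in total.
var points = cropland.stratifiedSample({
  numPoints: 50,
  classBand: 'landcover',
  region: geometry,
  scale: 1000,
  geometries: true
});

// Give each point a stable 'id' property for later joins.
points = points.map(function(f) {
  return f.set('id', f.get('system:index'));
});
print(points.first());
```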
SENTINEL-2 IMAGE COLLECTION
We will use atmospherically corrected Sentinel-2 Surface Reflectance data. To use it in our analysis, we should filter the collection to images overlapping the farm locations and falling within the time range. It is also important to apply cloud masking to remove cloudy pixels from the analysis. This part is fairly straightforward: you map functions over the collection to mask clouds and add an NDVI band, then filter it down to the date range and locations.
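A sketch of the collection preparation, assuming the farm points from the previous step are named points and the year of interest is 2019 (both assumptions; the QA60 bitmask values follow the COPERNICUS/S2_SR catalog entry).

```javascript
// Mask clouds using the Sentinel-2 QA60 band.
function maskS2clouds(image) {
  var qa = image.select('QA60');
  var cloudBitMask = 1 << 10;   // opaque clouds
  var cirrusBitMask = 1 << 11;  // cirrus clouds
  var mask = qa.bitwiseAnd(cloudBitMask).eq(0)
      .and(qa.bitwiseAnd(cirrusBitMask).eq(0));
  return image.updateMask(mask);
}

// Add an NDVI band computed from NIR (B8) and Red (B4).
function addNDVI(image) {
  var ndvi = image.normalizedDifference(['B8', 'B4']).rename('ndvi');
  return image.addBands(ndvi);
}

var startDate = ee.Date.fromYMD(2019, 1, 1);
var endDate = startDate.advance(1, 'year');

var collection = ee.ImageCollection('COPERNICUS/S2_SR')
  .filterDate(startDate, endDate)
  .filterBounds(points)
  .map(maskS2clouds)
  .map(addNDVI);
```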
Get Time Series for a Single Location
At this point, our collection has images spanning a full year. If we want to extract NDVI values at a single location for the full year, it is quite easy. We can use the built-in charting functions to chart the NDVI value over time.
Our collection has 100 points. We call .first() to get the first point from the collection and create a chart using the ui.Chart.image.series() function. Once you print a chart, you can click the button next to it to get an option to download the data as a CSV.
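A minimal sketch of the charting step, assuming the collection and points variables from the previous steps; the chart options are illustrative.

```javascript
// Chart NDVI over time at the first sample point.
var testPoint = ee.Feature(points.first());

var chart = ui.Chart.image.series({
  imageCollection: collection.select('ndvi'),
  region: testPoint.geometry(),
  reducer: ee.Reducer.mean(),
  scale: 10
}).setOptions({
  title: 'NDVI Time Series',
  vAxis: {title: 'NDVI'},
  hAxis: {title: 'Date', format: 'YYYY-MMM'},
  interpolateNulls: true
});
print(chart);
```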
This is a nice NDVI time-series chart showing the dual-cropping practice common in India.
EXPORTING TIME SERIES FOR A SINGLE LOCATION/REGION
If you want a time series over a polygon, the above technique still works. But if the region is large and your time series is long, you may still run into 'Computation timed out' errors. In that case, we can Export the results as a CSV. We can use the reduceRegion() function to get the NDVI value from an image. Since we want to do that for every image in the collection, we need to map() a function over the collection that reduces each image and returns a feature, giving us a FeatureCollection we can export.
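The steps above can be sketched as follows, again assuming the collection and points variables from earlier; the export description, folder, and file names are placeholders. Note that reduceRegion() returns null for dates when the pixel was cloud-masked, which appear as empty cells in the exported CSV.

```javascript
// Build a FeatureCollection with one feature (date, ndvi) per image.
var testPoint = ee.Feature(points.first());

var timeSeries = collection.select('ndvi').map(function(image) {
  var stats = image.reduceRegion({
    reducer: ee.Reducer.mean(),
    geometry: testPoint.geometry(),
    scale: 10
  });
  return ee.Feature(null, {
    'ndvi': stats.get('ndvi'),
    'date': image.date().format('YYYY-MM-dd')
  });
});

// Export the table to Google Drive as a CSV.
Export.table.toDrive({
  collection: timeSeries,
  description: 'Single_Location_NDVI_Time_Series',
  folder: 'earthengine',
  fileNamePrefix: 'ndvi_single_location',
  fileFormat: 'CSV',
  selectors: ['date', 'ndvi']
});
```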
Getting Time Series for Multiple Locations
While you can chart or export a time series for a single location as shown above, things start getting more complex when you want to do the same for many locations. Continuing the charting method above, you may think of using the ui.Chart.image.seriesByRegion() function to get a chart for all 100 points over the year. But you will quickly hit the limit of what can be done in Earth Engine's 'interactive' mode.
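A sketch of that attempt is below; with 100 points and a year of Sentinel-2 images, printing this chart will typically fail with a time-out in interactive mode.

```javascript
// Attempt to chart all 100 points at once - expect a time-out.
var chart = ui.Chart.image.seriesByRegion({
  imageCollection: collection.select('ndvi'),
  regions: points,
  reducer: ee.Reducer.mean(),
  scale: 10
});
print(chart);
```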
This is understandable. Earth Engine limits execution time in the interactive mode to 5 minutes and times out if your computation takes longer. In such cases, the recommendation is to switch to the 'batch' mode, which has far more resources and can run a computation for a long time. The way to use the batch mode is through any of the Export functions.
The method to export a time series is explained well in this tutorial. The code has a clever way of organizing the results of reduceRegions() into a table that can be exported. That code works when your points do not span a large area; if you try using it for this example, you will run into problems.
PROBLEM 1: HANDLING MASKED PIXELS
As we have masked cloudy pixels in the source images, those pixels will return null values, resulting in data gaps. And since our area spans multiple images, for any given point the majority of images will not intersect it and will also return a null value. We can fix this by assigning a NoData value (such as -9999) to missing values in the time series. Specifically, we use the ee.Reducer.firstNonNull() function to programmatically assign -9999 to any output containing a null value. Below is the modified code, which generates a table with each point id as a row and the NDVI values from each image as columns.
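A sketch of this step, adapted from the tutorial's approach: each image reduces all points at once with reduceRegions(), ee.Reducer.firstNonNull() substitutes -9999 where the pixel was masked or the image did not intersect the point, and a join pivots the results into one row per point. The property names 'id' and 'imageID' are assumptions; each point is assumed to carry a unique 'id' property.

```javascript
// One output feature per (point, image) pair, with -9999 for nulls.
var triplets = collection.map(function(image) {
  return image.select('ndvi').reduceRegions({
    collection: points,
    reducer: ee.Reducer.mean().setOutputs(['ndvi']),
    scale: 10
  }).map(function(feature) {
    var ndvi = ee.List([feature.get('ndvi'), -9999])
        .reduce(ee.Reducer.firstNonNull());
    return feature.set({'ndvi': ndvi, 'imageID': image.id()});
  });
}).flatten();

// Pivot the triplets into a table: one row per point id and one
// column per image id.
var format = function(table, rowId, colId) {
  var rows = table.distinct(rowId);
  var joined = ee.Join.saveAll('matches').apply({
    primary: rows,
    secondary: table,
    condition: ee.Filter.equals({leftField: rowId, rightField: rowId})
  });
  return joined.map(function(row) {
    var values = ee.List(row.get('matches')).map(function(feature) {
      feature = ee.Feature(feature);
      return [feature.get(colId), feature.get('ndvi')];
    });
    return row.select([rowId]).set(ee.Dictionary(values.flatten()));
  });
};

var sentinelResults = format(triplets, 'id', 'imageID');
```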
PROBLEM 2: GRANULE OVERLAPS
The second problem is specific to Sentinel-2 data and how individual images are produced from the raw data. If you are working with any other dataset (Landsat, MODIS, etc.), you can skip this step and export the collection generated in the previous step.
Sentinel-2 data is distributed as granules, also called tiles, which are 100x100 km² ortho-images. As you can see in the map below, there is an overlap between neighboring granules, so the same raw pixel can be present in up to 4 tiles. And since each granule is processed independently, the output pixel values can be slightly different.
If we exported the table generated in the previous step, we would see multiple NDVI values for the same day, which may or may not be identical. For our time series to be consistent, we need to harmonize these overlapping pixels. A reasonable solution is to take all NDVI values for the same day (generated from the same raw pixels) and assign the maximum of those values to that day. This results in a clean output with one NDVI value per point per day.
The following code finds all images of the same day and creates a single output for the day with the maximum of all NDVI values.
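A sketch of the merging step. It relies on the fact that Sentinel-2 image ids begin with the acquisition date (YYYYMMDD), so the first 8 characters of each column name identify the day; this is an assumption about the id format that should be checked against your collection.

```javascript
// Collapse same-day columns into one column per day, keeping the
// maximum NDVI value across overlapping granules.
var merge = function(table, rowId) {
  return table.map(function(feature) {
    var allKeys = feature.toDictionary().keys().remove(rowId);
    // First 8 characters of the image id = acquisition date.
    var substrKeys = ee.List(allKeys.map(function(val) {
      return ee.String(val).slice(0, 8);
    }));
    var uniqueKeys = substrKeys.distinct();
    var pairs = uniqueKeys.map(function(key) {
      // All column values whose image id contains this date.
      var matches = feature.toDictionary()
          .select(allKeys.filter(ee.Filter.stringContains('item', key)))
          .values();
      var val = matches.reduce(ee.Reducer.max());
      return [key, val];
    });
    return feature.select([rowId]).set(ee.Dictionary(pairs.flatten()));
  });
};

var sentinelMerged = merge(sentinelResults, 'id');
```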
Exporting the Time Series
The collection now contains formatted output. It can be exported as a CSV file. Running the code below will create an Export task. Click Run, confirm the parameters and start the task. Once the export task finishes, you will have the CSV file in your Google Drive.
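A sketch of the final export; the description, folder, and file name are placeholders to adjust for your own project.

```javascript
// Create an export task for the merged table.
Export.table.toDrive({
  collection: sentinelMerged,
  description: 'Multiple_Locations_NDVI_Time_Series',
  folder: 'earthengine',
  fileNamePrefix: 'ndvi_time_series_multiple',
  fileFormat: 'CSV'
});
```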
You can see the full script at https://code.earthengine.google.co.in/1e0980f450591b45d3d1dc07ebcc0364
Here is the resulting CSV file.
I hope you found the post useful and got some inspiration to apply it to your own problem. Do leave a comment if you have ideas to improve the code.
If you are new to Earth Engine and want to master it, check out my course End-to-End Google Earth Engine.