Extracting Time Series using Google Earth Engine

Time series analysis is one of the most common operations in remote sensing. It helps in understanding and modeling seasonal patterns, as well as in monitoring land cover changes. Earth Engine is uniquely suited to extracting dense time series over long periods of time.

In this post, I will go through different methods and approaches for time series extraction. While there are plenty of examples available that show how to extract a time series for a single location, unique challenges come up when you need a time series for many locations spanning a large area. I will explain those challenges and present code samples to solve them.

The ultimate goal for this exercise is to extract NDVI time series from Sentinel-2 data over 1 year for 100 farm locations spanning an entire state in India.

Prefer a video-based guide?

Check out this series of videos that gives you step-by-step instructions and explains the process of extracting an NDVI time series in Earth Engine using MODIS data.

Preparing the data

FARM LOCATIONS

To make the analysis relevant, we need a dataset of farm locations spanning a large area. If you have your own data, you can upload the shapefile to Earth Engine and use it. For the purpose of this post, however, I generated 100 random points. I wanted those random points to meet a certain criterion: they should fall on farmland growing a particular crop. We can utilize the GFSAD1000: Cropland Extent 1km Crop Dominance, Global Food-Support Analysis Data from the Earth Engine Data Catalog to select all pixels which are farmland growing wheat and rice. The stratifiedSample() method then allows us to generate sample points within those pixels, approximating a dataset of 100 farm locations.


var gaul = ee.FeatureCollection("FAO/GAUL/2015/level1");
var gfsad = ee.Image("USGS/GFSAD1000_V0");

// Select the 'landcover' band and keep pixels with value 1,
// which represent rice and wheat rainfed crops
var wheatrice = gfsad.select('landcover').eq(1)

// Uttar Pradesh is a large state in the Indo-Gangetic Plain with
// a large agricultural output.
// We use the Global Administrative Unit Layers (GAUL) dataset
// to get the state boundary
var uttarpradesh = gaul.filter(ee.Filter.eq('ADM1_NAME', 'Uttar Pradesh'))

// The wheatrice image contains 1 and 0 pixels. We want to generate points
// only in the pixels that are 1 (representing crop areas).
// selfMask() masks the pixels with 0 value.
var points = wheatrice.selfMask().stratifiedSample({
  numPoints: 100,
  region: uttarpradesh,
  geometries: true
})

// We need a unique id for each point. We take the feature id and set it as
// a property so we can refer to each point easily
var points = points.map(function(feature) {
  return ee.Feature(feature.geometry(), {'id': feature.id()})
})

// Show the state polygon with a blue outline
var outline = ee.Image().byte().paint({
  featureCollection: uttarpradesh,
  color: 1,
  width: 3
});
Map.addLayer(outline, {palette: ['blue']}, 'AOI')

// Show the farm locations in green
Map.addLayer(points, {color: 'green'}, 'Farm Locations')

SENTINEL-2 IMAGE COLLECTION

We will use atmospherically corrected Sentinel-2 Surface Reflectance data. To use it in our analysis, we should filter the collection to images overlapping the farm locations and falling within the time range. It is also important to apply cloud masking to remove cloudy pixels from the analysis. This part is fairly straightforward: you map functions over the collection to mask clouds and add an NDVI band, and then filter it down to a date range and location.


// Function to mask cloud, cloud shadow, cirrus and snow pixels
function maskCloudAndShadows(image) {
  var cloudProb = image.select('MSK_CLDPRB');
  var snowProb = image.select('MSK_SNWPRB');
  var cloud = cloudProb.lt(5);
  var snow = snowProb.lt(5);
  var scl = image.select('SCL');
  var shadow = scl.eq(3); // 3 = cloud shadow
  var cirrus = scl.eq(10); // 10 = cirrus
  // Keep pixels with low cloud and snow probability that are not
  // classified as cloud shadow or cirrus
  var mask = cloud.and(snow).and(shadow.not()).and(cirrus.not());
  return image.updateMask(mask);
}

// Add an NDVI band
function addNDVI(image) {
  var ndvi = image.normalizedDifference(['B8', 'B4']).rename('ndvi')
  return image.addBands([ndvi])
}

var startDate = '2019-01-01'
var endDate = '2019-12-31'

// Use Sentinel-2 L2A data - which has better cloud masking
var collection = ee.ImageCollection('COPERNICUS/S2_SR')
    .filterDate(startDate, endDate)
    .map(maskCloudAndShadows)
    .map(addNDVI)
    .filter(ee.Filter.bounds(points))

// View the median composite
var vizParams = {bands: ['B4', 'B3', 'B2'], min: 0, max: 2000}
Map.addLayer(collection.median(), vizParams, 'collection')
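For reference, normalizedDifference(['B8', 'B4']) in the addNDVI function implements the standard NDVI formula, (NIR - Red) / (NIR + Red), with B8 as the near-infrared band and B4 as red. A minimal plain-JavaScript sketch of the same per-pixel computation:

```javascript
// NDVI from near-infrared (B8) and red (B4) reflectance values.
// This mirrors what normalizedDifference(['B8', 'B4']) computes per pixel.
function computeNDVI(nir, red) {
  return (nir - red) / (nir + red);
}

// Healthy vegetation reflects strongly in NIR relative to red, so it
// produces high NDVI values; bare soil and water are low or negative.
console.log(computeNDVI(0.4, 0.1));   // dense vegetation
console.log(computeNDVI(0.15, 0.12)); // sparse or stressed vegetation
```

NDVI is bounded between -1 and 1, which is what makes it convenient to compare across dates in a time series.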

Get Time Series for a Single Location

At this point, our collection has images spanning a full year. Extracting NDVI values at a single location over the full year is quite easy: we can use the built-in charting functions to chart the NDVI value over time.

Our points collection has 100 features. We call .first() to get the first point from the collection and create a chart using the ui.Chart.image.series() function. Once you print a chart, you can click the export button next to it to get an option to download the data as a CSV.


var testPoint = ee.Feature(points.first())
//Map.centerObject(testPoint, 10)
var chart = ui.Chart.image.series({
    imageCollection: collection.select('ndvi'),
    region: testPoint.geometry()
    }).setOptions({
      interpolateNulls: true,
      lineWidth: 1,
      pointSize: 3,
      title: 'NDVI over Time at a Single Location',
      vAxis: {title: 'NDVI'},
      hAxis: {title: 'Date', format: 'YYYY-MMM', gridlines: {count: 12}}
    })
print(chart)

This is a nice NDVI time-series chart showing the dual-cropping practice common in India.

EXPORTING TIME SERIES FOR A SINGLE LOCATION/REGION

If you want a time series over a polygon, the above technique still works. But if the region is large and your time series is long, you may still run into 'Computation Time Out' errors. In that case, we can Export the results as a CSV. We can use the reduceRegion() function to get the NDVI value from an image. Since we want to do that for all images in the collection, we need to map() a function over the collection.


var filteredCollection = collection.select('ndvi')
  .filter(ee.Filter.bounds(testPoint.geometry()))

var timeSeries = ee.FeatureCollection(filteredCollection.map(function(image) {
  var stats = image.reduceRegion({
    reducer: ee.Reducer.mean(),
    geometry: testPoint.geometry(),
    scale: 10,
    maxPixels: 1e10
  })
  // reduceRegion doesn't return any output if the image doesn't intersect
  // with the point or if the image is masked out due to cloud.
  // If there was no ndvi value found, we set the ndvi to a NoData value -9999
  var ndvi = ee.List([stats.get('ndvi'), -9999])
    .reduce(ee.Reducer.firstNonNull())
  // Create a feature with null geometry and NDVI value and date as properties
  var f = ee.Feature(null, {'ndvi': ndvi,
    'date': ee.Date(image.get('system:time_start')).format('YYYY-MM-dd')})
  return f
}))

// Check the results
print(timeSeries.first())

// Export to CSV
Export.table.toDrive({
    collection: timeSeries,
    description: 'Single_Location_NDVI_time_series',
    folder: 'earthengine',
    fileNamePrefix: 'ndvi_time_series_single',
    fileFormat: 'CSV'
})

Getting Time Series for Multiple Locations

While you can chart or export a time series for a single location as shown above, things start getting more complex when you want to do the same for many locations. Continuing the charting method above, you may think of using the ui.Chart.image.seriesByRegion() function to get a chart for all 100 points over the year. But you will start hitting the limit of what can be done in Earth Engine's 'interactive' mode.


var chart = ui.Chart.image.seriesByRegion({
    imageCollection: collection.select('ndvi'),
    regions: points,
    reducer: ee.Reducer.mean()
})
// This doesn't work as the result is too large to print
print(chart)

This is understandable. Earth Engine limits the execution time in the interactive mode to 5 minutes and times out if your computation takes longer. In such cases, the recommendation is to switch to the 'batch' mode, which has a lot more resources and can run computations for a long time. The way to use the batch mode is through any of the Export functions.

The method to export a time series is explained well in this tutorial. The code there has a clever way of organizing the results of reduceRegions() into a table that can be exported. That code works when your points do not span a large area. If you try using the approach for this example, you will run into two problems.

PROBLEM 1: HANDLING MASKED PIXELS

As we have masked cloudy pixels in the source images, those pixels will return null values, resulting in data gaps. And since our area spans multiple images, for any given point the majority of the images will not intersect it and will also return null values. We can fix this by assigning a NoData value (such as -9999) to missing values in the time series. Specifically, we use the ee.Reducer.firstNonNull() reducer to programmatically assign -9999 to any null output. Below is the modified code that generates a table with each point id as a row and the NDVI values from each image as columns.


var triplets = collection.map(function(image) {
  return image.select('ndvi').reduceRegions({
    collection: points,
    reducer: ee.Reducer.mean().setOutputs(['ndvi']),
    scale: 10,
  })
  // reduceRegions doesn't return any output if the image doesn't intersect
  // with the point or if the image is masked out due to cloud.
  // If there was no ndvi value found, we set the ndvi to a NoData value -9999
  .map(function(feature) {
    var ndvi = ee.List([feature.get('ndvi'), -9999])
      .reduce(ee.Reducer.firstNonNull())
    return feature.set({'ndvi': ndvi, 'imageID': image.id()})
  })
}).flatten();

var format = function(table, rowId, colId) {
  var rows = table.distinct(rowId);
  var joined = ee.Join.saveAll('matches').apply({
    primary: rows,
    secondary: table,
    condition: ee.Filter.equals({
      leftField: rowId,
      rightField: rowId
    })
  });

  return joined.map(function(row) {
      var values = ee.List(row.get('matches'))
        .map(function(feature) {
          feature = ee.Feature(feature);
          return [feature.get(colId), feature.get('ndvi')];
        });
      return row.select([rowId]).set(ee.Dictionary(values.flatten()));
    });
};

var sentinelResults = format(triplets, 'id', 'imageID');
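The join inside format() can be hard to follow in Earth Engine's deferred execution style. The same pivot, sketched in plain JavaScript over a small in-memory array (the ids, image IDs and NDVI values here are made up for illustration):

```javascript
// Pivot a flat list of {id, imageID, ndvi} records into one object per
// point id, with one property per imageID - the same table shape that
// format() builds with a join in Earth Engine.
function pivot(triplets) {
  var rows = {};
  triplets.forEach(function(t) {
    if (!rows[t.id]) rows[t.id] = {id: t.id};
    rows[t.id][t.imageID] = t.ndvi;
  });
  return Object.keys(rows).map(function(id) { return rows[id]; });
}

// Hypothetical records: two points observed by two images each.
// -9999 is the NoData value assigned where the pixel was masked.
var triplets = [
  {id: '0', imageID: 'imgA', ndvi: 0.41},
  {id: '0', imageID: 'imgB', ndvi: -9999},
  {id: '1', imageID: 'imgA', ndvi: 0.38},
  {id: '1', imageID: 'imgB', ndvi: 0.52}
];
console.log(pivot(triplets));
```

Each output object corresponds to one row of the exported table: the point id plus one column per image.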

PROBLEM 2: GRANULE OVERLAPS

The second problem is specific to Sentinel-2 data and how individual images are produced from the raw data. If you are working with any other dataset (Landsat, MODIS etc.), skip this step and Export the collection generated in the previous step.

The Sentinel-2 data is distributed as granules, also called tiles, which are 100×100 km² ortho-images. As you can see in the map below, there is an overlap between neighboring granules, so the same raw pixel can be present in up to 4 tiles. And since each granule is processed independently, the output pixel values can be slightly different.

If we exported the table generated in the previous step, we will see multiple NDVI values for the same day which may or may not be the same. For our time series to be consistent, we need to harmonize these overlapping pixels. A decent solution is to take all NDVI values for the same day (generated from the same raw pixels) and assign the maximum of all values to that day. This results in a clean output with 1 NDVI value per point per day.

The following code finds all images of the same day and creates a single output for the day with the maximum of all NDVI values.


// There are multiple image granules for the same date processed from the same orbit.
// Granules overlap with each other and since they are processed independently
// the pixel values can differ slightly. So the same pixel can have different NDVI
// values for the same date from overlapping granules.
// To simplify the output, we can merge observations for each day
// and take the max NDVI value from overlapping observations.
var merge = function(table, rowId) {
  return table.map(function(feature) {
    var id = feature.get(rowId)
    var allKeys = feature.toDictionary().keys().remove(rowId)
    var substrKeys = ee.List(allKeys.map(function(val) {
      return ee.String(val).slice(0, 8)
    }))
    var uniqueKeys = substrKeys.distinct()
    var pairs = uniqueKeys.map(function(key) {
      var matches = feature.toDictionary()
        .select(allKeys.filter(ee.Filter.stringContains('item', key)))
        .values()
      var val = matches.reduce(ee.Reducer.max())
      return [key, val]
    })
    return feature.select([rowId]).set(ee.Dictionary(pairs.flatten()))
  })
}

var sentinelMerged = merge(sentinelResults, 'id');
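The key-slicing above is easier to see with concrete values. Here is the same grouping logic sketched in plain JavaScript for a single row, relying on the fact that each column name starts with an 8-character YYYYMMDD date (as Sentinel-2 image IDs do); the granule suffixes and NDVI values are hypothetical:

```javascript
// Merge the columns of one row that share the same YYYYMMDD prefix,
// keeping the maximum value - one NDVI value per point per day.
function mergeByDay(row) {
  var out = {id: row.id};
  Object.keys(row).forEach(function(key) {
    if (key === 'id') return;
    var day = key.slice(0, 8); // first 8 characters = YYYYMMDD
    out[day] = (out[day] === undefined) ? row[key] : Math.max(out[day], row[key]);
  });
  return out;
}

// Two overlapping granules saw this point on 2019-01-15 with slightly
// different NDVI values; the merged row keeps the max for that day.
var row = {
  id: '0',
  '20190115T052211_granuleA': 0.41,
  '20190115T052211_granuleB': 0.43,
  '20190120T052101_granuleA': 0.45
};
console.log(mergeByDay(row));
```

The Earth Engine version does exactly this, but with server-side ee.List and ee.Dictionary operations so it can run in the batch export.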

Exporting the Time Series

The collection now contains formatted output. It can be exported as a CSV file. Running the code below will create an Export task. Click Run, confirm the parameters and start the task. Once the export task finishes, you will have the CSV file in your Google Drive.


Export.table.toDrive({
    collection: sentinelMerged,
    description: 'Multiple_Locations_NDVI_time_series',
    folder: 'earthengine',
    fileNamePrefix: 'ndvi_time_series_multiple',
    fileFormat: 'CSV'
})

You can see the full script at https://code.earthengine.google.co.in/1e0980f450591b45d3d1dc07ebcc0364

Here is the resulting CSV file.
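To use the exported table downstream, you need to read it back and drop the -9999 NoData placeholders. A small Node.js sketch, assuming the column layout described above (an id column plus one YYYYMMDD column per day; table exports also typically carry system:index and .geo bookkeeping columns, which we skip):

```javascript
// Parse the exported CSV into {pointId: {YYYYMMDD: ndvi}},
// skipping bookkeeping columns and -9999 NoData values.
function parseTimeSeries(csvText) {
  var lines = csvText.trim().split('\n');
  var header = lines[0].split(',');
  var result = {};
  lines.slice(1).forEach(function(line) {
    var cells = line.split(',');
    var id = null;
    var series = {};
    header.forEach(function(col, i) {
      if (col === 'id') { id = cells[i]; return; }
      if (col === 'system:index' || col === '.geo') return;
      var value = parseFloat(cells[i]);
      if (value !== -9999) series[col] = value; // drop NoData
    });
    result[id] = series;
  });
  return result;
}

// Hypothetical two-point, two-day extract of the exported file
var sample = 'system:index,id,20190115,20190120,.geo\n' +
             '0,0,0.43,-9999,\n' +
             '1,1,-9999,0.38,';
console.log(parseTimeSeries(sample));
```

From here you can interpolate the gaps, smooth the series, or feed it into a phenology model.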

Hope you found the post useful and got some inspiration to apply it to your own problem. Do leave a comment if you have ideas to improve the code.

If you are new to Earth Engine and want to master it, check out my course End-to-End Google Earth Engine.
