[Python Data Analysis] pandas Time Series, Part 2

1. Resampling

# Resampling with resample()
import pandas as pd
import numpy as np

rng = pd.date_range('20190101', periods=12)
ts = pd.Series(np.arange(12), index=rng)
print(ts)

# resample('5D') returns a resampler object with a 5-day frequency
ts_re = ts.resample('5D')
# Applying an aggregation (here sum) produces a new aggregated Series
ts_re2 = ts.resample('5D').sum()
print(ts_re)
print(ts_re2)

print('-------')
print(ts.resample('5D').mean(), '-> mean')
print(ts.resample('5D').max(), '-> maximum')
print(ts.resample('5D').min(), '-> minimum')
print(ts.resample('5D').median(), '-> median')
print(ts.resample('5D').first(), '-> first value')
print(ts.resample('5D').last(), '-> last value')
print(ts.resample('5D').ohlc(), '-> OHLC resampling')
# OHLC: an aggregation common for financial time series
# open: first value, high: maximum, low: minimum, close: last value

-------------------------- Result ----------------------
2019-01-01     0
2019-01-02     1
2019-01-03     2
2019-01-04     3
2019-01-05     4
2019-01-06     5
2019-01-07     6
2019-01-08     7
2019-01-09     8
2019-01-10     9
2019-01-11    10
2019-01-12    11
Freq: D, dtype: int32
DatetimeIndexResampler [freq=<5 * Days>, axis=0, closed=left, label=left, convention=start, base=0]
2019-01-01    10
2019-01-06    35
2019-01-11    21
Freq: 5D, dtype: int32
-------
2019-01-01     2.0
2019-01-06     7.0
2019-01-11    10.5
Freq: 5D, dtype: float64 -> mean
2019-01-01     4
2019-01-06     9
2019-01-11    11
Freq: 5D, dtype: int32 -> maximum
2019-01-01     0
2019-01-06     5
2019-01-11    10
Freq: 5D, dtype: int32 -> minimum
2019-01-01     2.0
2019-01-06     7.0
2019-01-11    10.5
Freq: 5D, dtype: float64 -> median
2019-01-01     0
2019-01-06     5
2019-01-11    10
Freq: 5D, dtype: int32 -> first value
2019-01-01     4
2019-01-06     9
2019-01-11    11
Freq: 5D, dtype: int32 -> last value
            open  high  low  close
2019-01-01     0     4    0      4
2019-01-06     5     9    5      9
2019-01-11    10    11   10     11 -> OHLC resampling
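
The built-in reducers above can also be combined in one call: a resampler accepts .agg() with a list of function names and returns a DataFrame with one column per statistic. A minimal sketch, reusing the same 12-day series:

import pandas as pd
import numpy as np

rng = pd.date_range('20190101', periods=12)
ts = pd.Series(np.arange(12), index=rng)

# One pass over the 5-day bins, three statistics at once;
# the result is a DataFrame with columns 'sum', 'mean' and 'max'
print(ts.resample('5D').agg(['sum', 'mean', 'max']))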

2. Downsampling: high frequency to low frequency

# Downsampling: converting a higher frequency to a lower frequency

import pandas as pd
import numpy as np

rng = pd.date_range('20190101', periods=12)
ts = pd.Series(np.arange(12), index=rng)
print(ts)

# closed: which edge of each 5-day bin is included in the bin;
# for this frequency the default is closed='left' (left edge included, right excluded)
# closed='left' groups the days as [1,2,3,4,5], [6,7,8,9,10], [11,12]
# closed='right' groups the days as [1], [2,3,4,5,6], [7,8,9,10,11], [12]
print(ts.resample('5D', closed='left').sum())
print(ts.resample('5D', closed='right').sum())

------------------- Result -----------------------
2019-01-01     0
2019-01-02     1
2019-01-03     2
2019-01-04     3
2019-01-05     4
2019-01-06     5
2019-01-07     6
2019-01-08     7
2019-01-09     8
2019-01-10     9
2019-01-11    10
2019-01-12    11
Freq: D, dtype: int32
2019-01-01    10
2019-01-06    35
2019-01-11    21
Freq: 5D, dtype: int32
2018-12-27     0
2019-01-01    15
2019-01-06    40
2019-01-11    11
Freq: 5D, dtype: int32
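
While closed decides which edge of each bin belongs to the bin, the related label argument decides which edge is used as the index label of the result. A minimal sketch, reusing the same 12-day series; label only changes the stamps, not the grouping:

import pandas as pd
import numpy as np

rng = pd.date_range('20190101', periods=12)
ts = pd.Series(np.arange(12), index=rng)

# Same left-closed bins, labelled by their left edge (the default) ...
print(ts.resample('5D', closed='left', label='left').sum())
# ... and labelled by their right edge instead
print(ts.resample('5D', closed='left', label='right').sum())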

3. Upsampling: low frequency to high frequency

# Upsampling: converting a lower frequency to a higher frequency

import pandas as pd
import numpy as np

rng = pd.date_range('20190101', periods=3)
ts = pd.Series(np.arange(3), index=rng)
print(ts)

print(ts.resample('15T').asfreq())
print(ts.resample('15T').ffill())
print(ts.resample('15T').bfill())
# asfreq(): no filling, the new slots are NaN
# ffill(): forward fill, propagate the previous known value forward
# bfill(): backward fill, fill each slot with the next known value

----------------------- Result ------------------
2019-01-01    0
2019-01-02    1
2019-01-03    2
Freq: D, dtype: int32
2019-01-01 00:00:00    0.0
2019-01-01 00:15:00    NaN
2019-01-01 00:30:00    NaN
2019-01-01 00:45:00    NaN
2019-01-01 01:00:00    NaN
2019-01-01 01:15:00    NaN
2019-01-01 01:30:00    NaN
2019-01-01 01:45:00    NaN
2019-01-01 02:00:00    NaN
2019-01-01 02:15:00    NaN
2019-01-01 02:30:00    NaN
2019-01-01 02:45:00    NaN
2019-01-01 03:00:00    NaN
2019-01-01 03:15:00    NaN
2019-01-01 03:30:00    NaN
2019-01-01 03:45:00    NaN
2019-01-01 04:00:00    NaN
2019-01-01 04:15:00    NaN
2019-01-01 04:30:00    NaN
2019-01-01 04:45:00    NaN
2019-01-01 05:00:00    NaN
2019-01-01 05:15:00    NaN
2019-01-01 05:30:00    NaN
2019-01-01 05:45:00    NaN
2019-01-01 06:00:00    NaN
2019-01-01 06:15:00    NaN
2019-01-01 06:30:00    NaN
2019-01-01 06:45:00    NaN
2019-01-01 07:00:00    NaN
2019-01-01 07:15:00    NaN
                      ... 
2019-01-02 16:45:00    NaN
2019-01-02 17:00:00    NaN
2019-01-02 17:15:00    NaN
2019-01-02 17:30:00    NaN
2019-01-02 17:45:00    NaN
2019-01-02 18:00:00    NaN
2019-01-02 18:15:00    NaN
2019-01-02 18:30:00    NaN
2019-01-02 18:45:00    NaN
2019-01-02 19:00:00    NaN
2019-01-02 19:15:00    NaN
2019-01-02 19:30:00    NaN
2019-01-02 19:45:00    NaN
2019-01-02 20:00:00    NaN
2019-01-02 20:15:00    NaN
2019-01-02 20:30:00    NaN
2019-01-02 20:45:00    NaN
2019-01-02 21:00:00    NaN
2019-01-02 21:15:00    NaN
2019-01-02 21:30:00    NaN
2019-01-02 21:45:00    NaN
2019-01-02 22:00:00    NaN
2019-01-02 22:15:00    NaN
2019-01-02 22:30:00    NaN
2019-01-02 22:45:00    NaN
2019-01-02 23:00:00    NaN
2019-01-02 23:15:00    NaN
2019-01-02 23:30:00    NaN
2019-01-02 23:45:00    NaN
2019-01-03 00:00:00    2.0
Freq: 15T, Length: 193, dtype: float64
2019-01-01 00:00:00    0
2019-01-01 00:15:00    0
2019-01-01 00:30:00    0
2019-01-01 00:45:00    0
2019-01-01 01:00:00    0
2019-01-01 01:15:00    0
2019-01-01 01:30:00    0
2019-01-01 01:45:00    0
2019-01-01 02:00:00    0
2019-01-01 02:15:00    0
2019-01-01 02:30:00    0
2019-01-01 02:45:00    0
2019-01-01 03:00:00    0
2019-01-01 03:15:00    0
2019-01-01 03:30:00    0
2019-01-01 03:45:00    0
2019-01-01 04:00:00    0
2019-01-01 04:15:00    0
2019-01-01 04:30:00    0
2019-01-01 04:45:00    0
2019-01-01 05:00:00    0
2019-01-01 05:15:00    0
2019-01-01 05:30:00    0
2019-01-01 05:45:00    0
2019-01-01 06:00:00    0
2019-01-01 06:15:00    0
2019-01-01 06:30:00    0
2019-01-01 06:45:00    0
2019-01-01 07:00:00    0
2019-01-01 07:15:00    0
                      ..
2019-01-02 16:45:00    1
2019-01-02 17:00:00    1
2019-01-02 17:15:00    1
2019-01-02 17:30:00    1
2019-01-02 17:45:00    1
2019-01-02 18:00:00    1
2019-01-02 18:15:00    1
2019-01-02 18:30:00    1
2019-01-02 18:45:00    1
2019-01-02 19:00:00    1
2019-01-02 19:15:00    1
2019-01-02 19:30:00    1
2019-01-02 19:45:00    1
2019-01-02 20:00:00    1
2019-01-02 20:15:00    1
2019-01-02 20:30:00    1
2019-01-02 20:45:00    1
2019-01-02 21:00:00    1
2019-01-02 21:15:00    1
2019-01-02 21:30:00    1
2019-01-02 21:45:00    1
2019-01-02 22:00:00    1
2019-01-02 22:15:00    1
2019-01-02 22:30:00    1
2019-01-02 22:45:00    1
2019-01-02 23:00:00    1
2019-01-02 23:15:00    1
2019-01-02 23:30:00    1
2019-01-02 23:45:00    1
2019-01-03 00:00:00    2
Freq: 15T, Length: 193, dtype: int32
2019-01-01 00:00:00    0
2019-01-01 00:15:00    1
2019-01-01 00:30:00    1
2019-01-01 00:45:00    1
2019-01-01 01:00:00    1
2019-01-01 01:15:00    1
2019-01-01 01:30:00    1
2019-01-01 01:45:00    1
2019-01-01 02:00:00    1
2019-01-01 02:15:00    1
2019-01-01 02:30:00    1
2019-01-01 02:45:00    1
2019-01-01 03:00:00    1
2019-01-01 03:15:00    1
2019-01-01 03:30:00    1
2019-01-01 03:45:00    1
2019-01-01 04:00:00    1
2019-01-01 04:15:00    1
2019-01-01 04:30:00    1
2019-01-01 04:45:00    1
2019-01-01 05:00:00    1
2019-01-01 05:15:00    1
2019-01-01 05:30:00    1
2019-01-01 05:45:00    1
2019-01-01 06:00:00    1
2019-01-01 06:15:00    1
2019-01-01 06:30:00    1
2019-01-01 06:45:00    1
2019-01-01 07:00:00    1
2019-01-01 07:15:00    1
                      ..
2019-01-02 16:45:00    2
2019-01-02 17:00:00    2
2019-01-02 17:15:00    2
2019-01-02 17:30:00    2
2019-01-02 17:45:00    2
2019-01-02 18:00:00    2
2019-01-02 18:15:00    2
2019-01-02 18:30:00    2
2019-01-02 18:45:00    2
2019-01-02 19:00:00    2
2019-01-02 19:15:00    2
2019-01-02 19:30:00    2
2019-01-02 19:45:00    2
2019-01-02 20:00:00    2
2019-01-02 20:15:00    2
2019-01-02 20:30:00    2
2019-01-02 20:45:00    2
2019-01-02 21:00:00    2
2019-01-02 21:15:00    2
2019-01-02 21:30:00    2
2019-01-02 21:45:00    2
2019-01-02 22:00:00    2
2019-01-02 22:15:00    2
2019-01-02 22:30:00    2
2019-01-02 22:45:00    2
2019-01-02 23:00:00    2
2019-01-02 23:15:00    2
2019-01-02 23:30:00    2
2019-01-02 23:45:00    2
2019-01-03 00:00:00    2
Freq: 15T, Length: 193, dtype: int32
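
Besides copying a neighbouring value with ffill()/bfill(), a resampler can also interpolate between the known points, and the fill methods take an optional limit on how many consecutive slots may be filled. A minimal sketch, reusing the same 3-day series:

import pandas as pd
import numpy as np

rng = pd.date_range('20190101', periods=3)
ts = pd.Series(np.arange(3), index=rng)

# Linear interpolation between the known daily values
print(ts.resample('15T').interpolate())
# Forward fill at most 2 consecutive 15-minute slots; the rest stay NaN
print(ts.resample('15T').ffill(limit=2))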

 
