A few tips from using Polar
- After you get a Polar watch, it is best to install the official desktop software, connect the watch and sync the settings once. The language and time zone will usually switch to your local ones, and this also keeps the watch from sitting in training mode all the time, which drains the battery and forces frequent charging.
- You need to sync the data to the Polar website regularly, either through the phone app or the desktop sync software. I recommend the desktop software: the app sync drops easily and often takes many attempts. New training data only shows up on the website after a sync.
- For everyday use, just keep wearing the watch, and switch to the matching sport profile when you actually train.
These are just my personal tips, for reference only.
Download and install Python and Chrome
Both the scraper and the data visualization are written in Python, so you need to install Python and the required modules. You also need the Chrome browser to inspect the pages of the Polar website, which is how we work out what to scrape.
- python
For Python, just use Anaconda: it is an integrated distribution and easy to set up. For the installation steps, you can refer to an earlier article of mine: https://zhuanlan.zhihu.com/p/186223176
- plotly
In that earlier article, the second part installed backtrader; this time run pip install plotly to install the plotly module. A quick check that the install works is sketched right after this list.
- chrome browser
Search for and download the Chrome browser; any of the recent versions will do, there is no particular requirement.
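For reference, the modules used later in this article are requests, pandas, numpy and plotly; a full Anaconda install typically already ships the first three, so plotly is usually the only extra one. Here is a minimal sketch to confirm everything is in place (the numbers are made up, not Polar data):
# quick check that the modules used below are available
import requests, pandas, numpy
import plotly.express as px

# draw a trivial chart to confirm plotly works (dummy numbers, not Polar data)
fig = px.line(x=[1, 2, 3, 4], y=[10, 8, 12, 9], title="plotly install check")
fig.show()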
Log in to the Polar website with Chrome
Open https://flow.polar.com/ in Chrome, enter your account and password and log in; you will land on your Polar Flow home page.
Getting the sleep data
- First, on the Polar Flow page you are logged in to, right-click, choose Inspect, and open the Network tab at the top of the DevTools panel. Then click Sleep on the page: a lot of requests will show up, and the task is to find which URL actually returns the sleep data. To know which request carries it, you have to open the requests one by one and check the Response tab for the information you want. You do not need to do this packet-hunting yourself, though; I have already found the URL: https://sleep.flow-prd.api.polar.com/api/sleep/nights/nearby?date=2021-01-24
- You need to fill in headers and data according to what your own browser shows.
headers = {
    "authority": "sleep.flow-prd.api.polar.com",
    "accept": "application/json, text/javascript, */*; q=0.01",
    "accept-encoding": "gzip, deflate, br",
    "accept-language": "zh-CN,zh;q=0.9",
    "content-type": "application/json",
    "cookie": "POLAR_SSO=1; POLAR_SESSION=eyJhbGciOiJSUzI1NiJ9.eyJzY29wZXMiOlsiUE9MQVJfU1NPIl0sImV4cCI6MTY0Mjg1Njc2NywiaWF0IjoxNjExMzIwNzY3LCJ1c2VySWQiOjQzMzg3MTk0LCJ1dWlkIjoiYWQ2MGRhZDEtN2VlYi00YWNiLWI2ZmEtMjE4ZTE5MTFlNTM1In0.xwtiI0Irb-1JdPNSKdbbutzNsVCqlLGkPBfT-FQ5RrHWiTmJLmtkKEAbi5cGj5NDG8l1W45hh6meAv4HOkbTopqJm6SJYzWazv0i0kN5vfsKkYWoHHg8jFDyAOR_QkUra5SFEPKvA3i-N9tFxP_z3HWDQX4Lvw78eLxpkRaJmhg9V7PcgxAPK5DNfDMi6TIEdIGA9zyYpVznJw6mW370tOSUVtc10SY2Ynfk-DUBcDalw2ewnMQ3DAI-U7HrZ7G_frBJxxL0QTnzlJ0vwCChR4Krl9ikk0zb8v8hlCZFjEtAW858oli_2Sxo0X_E3BN6zVvrnr1cH_iqjh0wBcEt7g; JSESSIONID=09C074C68BA81D445CC6592BCB47A8B6; OptanonAlertBoxClosed=2021-01-22T13:08:07.595Z; OptanonConsent=isIABGlobal=false&datestamp=Fri+Jan+22+2021+21%3A08%3A07+GMT%2B0800+(%E4%B8%AD%E5%9B%BD%E6%A0%87%E5%87%86%E6%97%B6%E9%97%B4)&version=6.6.0&hosts=&landingPath=NotLandingPage&groups=C0004%3A1%2CC0001%3A1%2CC0002%3A1%2CC0003%3A1&AwaitingReconsent=false; _ga=GA1.2.557689735.1611320888; _gid=GA1.2.1639346864.1611320888; _gat=1; _dc_gtm_UA-66185860-2=1",
    "origin": "https://flow.polar.com",
    "user-agent": "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/87.0.4280.88 Safari/537.36",
    "x-requested-with": "XMLHttpRequest"
}
The values in this headers dict have to be copied from the Request Headers of that request in your own browser and replaced with your own; at the very least you need to replace the cookie, and if any other field differs it is worth updating as well. The code below fetches the sleep data from 2020-01-01 to 2020-12-31; change these two dates if you want a different range.
import requests
import pandas as pd
import numpy as np
import re
# fetch the sleep data
result = []
url = "https://sleep.flow-prd.api.polar.com/api/sleep/nights/nearby"
headers = {
    "authority": "sleep.flow-prd.api.polar.com",
    "accept": "application/json, text/javascript, */*; q=0.01",
    "accept-encoding": "gzip, deflate, br",
    "accept-language": "zh-CN,zh;q=0.9",
    "content-type": "application/json",
    "cookie": "POLAR_SSO=1; POLAR_SESSION=eyJhbGciOiJSUzI1NiJ9.eyJzY29wZXMiOlsiUE9MQVJfU1NPIl0sImV4cCI6MTY0Mjg1Njc2NywiaWF0IjoxNjExMzIwNzY3LCJ1c2VySWQiOjQzMzg3MTk0LCJ1dWlkIjoiYWQ2MGRhZDEtN2VlYi00YWNiLWI2ZmEtMjE4ZTE5MTFlNTM1In0.xwtiI0Irb-1JdPNSKdbbutzNsVCqlLGkPBfT-FQ5RrHWiTmJLmtkKEAbi5cGj5NDG8l1W45hh6meAv4HOkbTopqJm6SJYzWazv0i0kN5vfsKkYWoHHg8jFDyAOR_QkUra5SFEPKvA3i-N9tFxP_z3HWDQX4Lvw78eLxpkRaJmhg9V7PcgxAPK5DNfDMi6TIEdIGA9zyYpVznJw6mW370tOSUVtc10SY2Ynfk-DUBcDalw2ewnMQ3DAI-U7HrZ7G_frBJxxL0QTnzlJ0vwCChR4Krl9ikk0zb8v8hlCZFjEtAW858oli_2Sxo0X_E3BN6zVvrnr1cH_iqjh0wBcEt7g; JSESSIONID=09C074C68BA81D445CC6592BCB47A8B6; OptanonAlertBoxClosed=2021-01-22T13:08:07.595Z; OptanonConsent=isIABGlobal=false&datestamp=Fri+Jan+22+2021+21%3A08%3A07+GMT%2B0800+(%E4%B8%AD%E5%9B%BD%E6%A0%87%E5%87%86%E6%97%B6%E9%97%B4)&version=6.6.0&hosts=&landingPath=NotLandingPage&groups=C0004%3A1%2CC0001%3A1%2CC0002%3A1%2CC0003%3A1&AwaitingReconsent=false; _ga=GA1.2.557689735.1611320888; _gid=GA1.2.1639346864.1611320888; _gat=1; _dc_gtm_UA-66185860-2=1",
    "origin": "https://flow.polar.com",
    "user-agent": "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/87.0.4280.88 Safari/537.36",
    "x-requested-with": "XMLHttpRequest"
}
datetime_list = list(pd.date_range(start='2020-01-01', end='2020-12-31'))
datetime_list = [i.strftime("%Y-%m-%d") for i in datetime_list]
for datetime_ in datetime_list:
    data = {"date": datetime_}
    r = requests.get(url, headers=headers, params=data)
    content = r.json()
    date = content['previousNights'][0]['nightSleep']['night']
    sleep_start_time = content['previousNights'][0]['nightSleep']['sleepStartTime']
    sleep_end_time = content['previousNights'][0]['nightSleep']['sleepEndTime']
    sleep_time = content['previousNights'][0]['sleepEvaluationData']['asleep']
    efficiencyPercent = content['previousNights'][0]['sleepEvaluationData']['efficiencyPercent']
    continuityIndex = content['previousNights'][0]['sleepEvaluationData']['continuityIndex']
    result.append([date, sleep_start_time, sleep_end_time, sleep_time, efficiencyPercent, continuityIndex])
    print(date, sleep_start_time, sleep_end_time, sleep_time, efficiencyPercent, continuityIndex)
sleep_df = pd.DataFrame(result)
sleep_df.columns = ['date', 'sleep_start_time', 'sleep_end_time', 'sleep_time', 'efficiencyPercent', 'continuityIndex']
sleep_df.to_csv("2020年睡眠数据.csv")
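One caveat: the loop above assumes every date in the range has an entry under previousNights, so it can stop with an IndexError or KeyError on a date where no sleep was recorded. A hedged tweak of my own (not anything Polar documents) is to simply skip those dates, replacing the loop body with something like this:
# same loop, but skip dates with no recorded sleep instead of crashing
for datetime_ in datetime_list:
    r = requests.get(url, headers=headers, params={"date": datetime_})
    try:
        night = r.json()['previousNights'][0]
        result.append([night['nightSleep']['night'],
                       night['nightSleep']['sleepStartTime'],
                       night['nightSleep']['sleepEndTime'],
                       night['sleepEvaluationData']['asleep'],
                       night['sleepEvaluationData']['efficiencyPercent'],
                       night['sleepEvaluationData']['continuityIndex']])
    except (IndexError, KeyError, ValueError):
        # no sleep recorded for this date (or an unexpected response); skip it
        continue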
Getting the activity data
- Getting the activity data works the same way as the sleep data: you need to know which URL returns it. The activity data comes from "https://flow.polar.com/activity/data/1.1.2020/31.12.2020?_=1611325770705". The two dates in the path are the start date (1.1.2020) and the end date (31.12.2020), written as day.month.year, and you can change them to fetch the data between any two dates you need; a small helper for building this URL is sketched after the script below.
- As before, replace the headers with the values from your own browser.
# scrape the 2020 activity data
import requests
import pandas as pd
import numpy as np
import re
result = []
url = "https://flow.polar.com/activity/data/1.1.2020/31.12.2020?_=1611325770705"
headers = {
"authority": "sleep.flow-prd.api.polar.com",
"accept": "application/json, text/javascript, */*; q=0.01",
"accept-encoding": "gzip, deflate, br",
"accept-language": "zh-CN,zh;q=0.9",
"content-type": "application/json",
"cookie": "POLAR_SSO=1; POLAR_SESSION=eyJhbGciOiJSUzI1NiJ9.eyJzY29wZXMiOlsiUE9MQVJfU1NPIl0sImV4cCI6MTY0Mjg1Njc2NywiaWF0IjoxNjExMzIwNzY3LCJ1c2VySWQiOjQzMzg3MTk0LCJ1dWlkIjoiYWQ2MGRhZDEtN2VlYi00YWNiLWI2ZmEtMjE4ZTE5MTFlNTM1In0.xwtiI0Irb-1JdPNSKdbbutzNsVCqlLGkPBfT-FQ5RrHWiTmJLmtkKEAbi5cGj5NDG8l1W45hh6meAv4HOkbTopqJm6SJYzWazv0i0kN5vfsKkYWoHHg8jFDyAOR_QkUra5SFEPKvA3i-N9tFxP_z3HWDQX4Lvw78eLxpkRaJmhg9V7PcgxAPK5DNfDMi6TIEdIGA9zyYpVznJw6mW370tOSUVtc10SY2Ynfk-DUBcDalw2ewnMQ3DAI-U7HrZ7G_frBJxxL0QTnzlJ0vwCChR4Krl9ikk0zb8v8hlCZFjEtAW858oli_2Sxo0X_E3BN6zVvrnr1cH_iqjh0wBcEt7g; JSESSIONID=09C074C68BA81D445CC6592BCB47A8B6; OptanonAlertBoxClosed=2021-01-22T13:08:07.595Z; OptanonConsent=isIABGlobal=false&datestamp=Fri+Jan+22+2021+21%3A08%3A07+GMT%2B0800+(%E4%B8%AD%E5%9B%BD%E6%A0%87%E5%87%86%E6%97%B6%E9%97%B4)&version=6.6.0&hosts=&landingPath=NotLandingPage&groups=C0004%3A1%2CC0001%3A1%2CC0002%3A1%2CC0003%3A1&AwaitingReconsent=false; _ga=GA1.2.557689735.1611320888; _gid=GA1.2.1639346864.1611320888; _gat=1; _dc_gtm_UA-66185860-2=1",
"origin": "https://flow.polar.com",
"user-agent": "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/87.0.4280.88 Safari/537.36",
"x-requested-with": "XMLHttpRequest"
}
# data = {"date":datetime_}
r = requests.get(url,headers = headers)
content = r.json()
# organize the data
result = []
for i in content['data']:
    datetime = i['summaries'][0]['datetime']
    goalPercent = i['summaries'][0]['goalPercent']
    calories = i['summaries'][0]['calories']
    stepCount = i['summaries'][0]['stepCount']
    distanceFromSteps = i['summaries'][0]['distanceFromSteps']
    activeTime = i['summaries'][0]['activeTime']
    inActiveTime = i['summaries'][0]['inActiveTime']
    sleepTime = i['summaries'][0]['sleepTime']
    sleepQuality = i['summaries'][0]['sleepQuality']
    result.append([datetime, goalPercent, calories, stepCount, distanceFromSteps, activeTime, inActiveTime, sleepTime, sleepQuality])
# save the data
run_df = pd.DataFrame(result)
run_df.columns = ['datetime','goalPercent','calories','stepCount','distanceFromSteps','activeTime','inActiveTime','sleepTime','sleepQuality']
run_df.to_csv("2020年运动数据.csv")
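If you want a different year or range, the only fiddly part is the day.month.year format in the URL path; the ?_=... query string appears to be just a cache-busting timestamp added by the page, so reusing the one from your browser should be fine. Below is a small helper of my own making, purely as a sketch:
# hypothetical helper: build the activity URL for an arbitrary date range
from datetime import date

def activity_url(start, end):
    # Polar Flow writes the dates as day.month.year without leading zeros
    fmt = lambda d: f"{d.day}.{d.month}.{d.year}"
    return f"https://flow.polar.com/activity/data/{fmt(start)}/{fmt(end)}"

print(activity_url(date(2019, 1, 1), date(2019, 12, 31)))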
If nothing goes wrong, both the sleep data and the activity data are now saved as CSV files in your local working directory.
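With the two CSV files on disk, you can already draw a quick chart with the plotly module installed earlier. This is only a minimal sketch of the display step, using the column names produced by the scripts above:
# minimal sketch: visualize the scraped CSV files with plotly
import pandas as pd
import plotly.express as px

activity_df = pd.read_csv("2020年运动数据.csv")
sleep_df = pd.read_csv("2020年睡眠数据.csv")

# daily step count over the year
px.line(activity_df, x="datetime", y="stepCount", title="Daily step count, 2020").show()

# time asleep per night
px.line(sleep_df, x="date", y="sleep_time", title="Time asleep per night, 2020").show()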
Wisdom, the soul, wealth: at least one of them should always be on the road. May we keep growing and maturing along the way~~~
If you are interested, you can follow my columns:
my_quant_study_note: some of my thoughts on quantitative investing and quantitative trading.
backtrader量化投资回测与交易: this column is free and shares backtrader-related content.
量化投资神器-backtrader源码解析-从入门到精通: this column currently costs 99 yuan; the plan is 100 strategy articles plus 20 backtrader tutorials plus 80 source-code analysis articles.