Installing cartopy on a supercomputer platform and fixing the failure to read features when submitting jobs

Installing cartopy

The simplest approach is a one-line conda install from conda-forge (shown below), but oddly, when I ran it on the Sugon platform it got stuck at "solving environment" and never finished. So I ended up installing the dependencies one by one and finally installing cartopy itself with pip.
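
For reference, the conda route is the following; creating a fresh environment first (the name cartopy_env and the Python version are only examples) sometimes lets conda resolve the environment faster, though it is not guaranteed to cure the hang:

conda create -n cartopy_env python=3.9
conda activate cartopy_env
conda install -c conda-forge cartopy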

There are six dependencies in total: pyproj, Pillow, pyshp, Shapely, PROJ and GEOS.
The first four can be installed directly with pip. Note that Shapely should preferably be below version 2.0.0, otherwise you are likely to hit an ImportError about lgeos.
PROJ and GEOS are not Python libraries and have to be installed by other means.

pip install numpy -i https://pypi.doubanio.com/simple/
pip install matplotlib -i https://pypi.doubanio.com/simple/
pip install pyproj -i https://pypi.doubanio.com/simple/
pip install pillow -i https://pypi.doubanio.com/simple/
pip install pyshp -i https://pypi.doubanio.com/simple/
# the minimum required shapely version is listed on the cartopy website; avoid versions that are too new
# https://scitools.org.uk/cartopy/docs/latest/installing.html
pip install shapely==1.6.4 -i https://pypi.doubanio.com/simple/
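
To confirm which Shapely version actually ended up in the environment (shapely.__version__ is the standard version attribute):

python -c "import shapely; print(shapely.__version__)"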

Building PROJ and GEOS from source is described on their official websites:
GEOS
PROJ
On a supercomputer, however, you normally have no administrator privileges, so building them is difficult. You can try installing them with conda instead:

conda install -c conda-forge GEOS
conda install -c conda-forge PROJ==8.0.0
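
After the conda step, you can check that the two libraries are visible in the active environment; geos-config and proj are command-line tools shipped with the geos and proj packages:

geos-config --version   # prints the installed GEOS version
proj                    # the usage message printed here includes the PROJ release number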

With all of the above in place, cartopy itself can be installed directly with pip:

pip install cartopy -i https://pypi.doubanio.com/simple/

Note: if this step fails, the error message usually says that the PROJ or GEOS version is wrong. In that case, remove geos and proj with pip uninstall, reinstall them with conda while pinning at least the minimum required versions, and then install cartopy again; this time it should succeed.
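
A sketch of that recovery sequence (take the exact minimum versions from the cartopy installing page linked above; PROJ==8.0.0 matches the pin used earlier):

# remove the old copies (use conda remove instead if they were installed with conda)
pip uninstall geos proj
# reinstall with conda, pinning at least the minimum required versions
conda install -c conda-forge GEOS PROJ==8.0.0
# retry the cartopy install
pip install cartopy -i https://pypi.doubanio.com/simple/
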
Once the installation succeeds, you can run the following script to check that everything works.

import matplotlib.pyplot as plt
import cartopy.crs as ccrs

plt.figure(figsize=(6, 3))
ax = plt.axes(projection=ccrs.PlateCarree(central_longitude=180))
ax.coastlines(resolution='110m')
ax.gridlines()
plt.savefig('test.png')

Compute nodes without internet access: coastlines and other features cannot be plotted

Sometimes the data are so large that the plotting has to be submitted as a job to a compute node. Because compute nodes cannot reach the internet, errors such as URLError appear, and the required data have to be downloaded manually in advance.
The data are normally stored under ~/.local/share/cartopy. Cartopy's GitHub repository provides a download script, reproduced below; copy it and save it as feature_download.py.

#!/usr/bin/env python
# Copyright Cartopy Contributors
#
# This file is part of Cartopy and is released under the LGPL license.
# See COPYING and COPYING.LESSER in the root of the repository for full
# licensing details.
"""
This module provides a command-line tool for triggering the download of
the data used by various Feature instances.
For detail on how to use this tool, execute it with the `-h` option:
    python cartopy_feature_download.py -h
"""

import argparse
import pathlib

from cartopy import config
from cartopy.feature import Feature, GSHHSFeature, NaturalEarthFeature
from cartopy.io import Downloader, DownloadWarning


ALL_SCALES = ('110m', '50m', '10m')

# See https://github.com/SciTools/cartopy/pull/1833
URL_TEMPLATE = ('https://naturalearth.s3.amazonaws.com/{resolution}_'
                '{category}/ne_{resolution}_{name}.zip')
SHP_NE_SPEC = ('shapefiles', 'natural_earth')

FEATURE_DEFN_GROUPS = {
    # Only need one GSHHS resolution because they *all* get downloaded
    # from one file.
    'gshhs': GSHHSFeature(scale='f'),
    'physical': (
        ('physical', 'coastline', ALL_SCALES),
        ('physical', 'land', ALL_SCALES),
        ('physical', 'ocean', ALL_SCALES),
        ('physical', 'rivers_lake_centerlines', ALL_SCALES),
        ('physical', 'lakes', ALL_SCALES),
        ('physical', 'geography_regions_polys', ALL_SCALES),
        ('physical', 'geography_regions_points', ALL_SCALES),
        ('physical', 'geography_marine_polys', ALL_SCALES),
        ('physical', 'glaciated_areas', ALL_SCALES),
        ('physical', 'antarctic_ice_shelves_polys', ('50m', '10m'))
    ),
    'cultural': (
        ('cultural', 'admin_0_countries', ALL_SCALES),
        ('cultural', 'admin_0_countries_lakes', ALL_SCALES),
        ('cultural', 'admin_0_sovereignty', ALL_SCALES),
        ('cultural', 'admin_0_boundary_lines_land', ALL_SCALES),

        ('cultural', 'urban_areas', ('50m', '10m')),

        ('cultural', 'roads', '10m'),
        ('cultural', 'roads_north_america', '10m'),
        ('cultural', 'railroads', '10m'),
        ('cultural', 'railroads_north_america', '10m'),
    ),
    'cultural-extra': (
        ('cultural', 'admin_0_map_units', '110m'),
        ('cultural', 'admin_0_scale_rank', '110m'),
        ('cultural', 'admin_0_tiny_countries', '110m'),
        ('cultural', 'admin_0_pacific_groupings', '110m'),
        ('cultural', 'admin_1_states_provinces', '110m'),
        ('cultural', 'admin_1_states_provinces_lines', '110m'),
        ('cultural', 'admin_1_states_provinces_lakes', ALL_SCALES),
    ),
}


def download_features(group_names, dry_run=True):
    for group_name in group_names:
        feature_defns = FEATURE_DEFN_GROUPS[group_name]
        if isinstance(feature_defns, Feature):
            feature = feature_defns
            level = list(feature._levels)[0]
            downloader = Downloader.from_config(('shapefiles', 'gshhs',
                                                 feature._scale, level))
            format_dict = {'config': config, 'scale': feature._scale,
                           'level': level}
            if dry_run:
                print(f'URL: {downloader.url(format_dict)}')
            else:
                downloader.path(format_dict)
                geoms = list(feature.geometries())
                print(f'Feature {feature} length: {len(geoms)}')
        else:
            for category, name, scales in feature_defns:
                if not isinstance(scales, tuple):
                    scales = (scales,)
                for scale in scales:
                    downloader = Downloader.from_config(('shapefiles',
                                                         'natural_earth',
                                                         scale, category,
                                                         name))
                    feature = NaturalEarthFeature(category, name, scale)
                    format_dict = {'config': config, 'category': category,
                                   'name': name, 'resolution': scale}
                    if dry_run:
                        print(f'URL: {downloader.url(format_dict)}')
                    else:
                        downloader.path(format_dict)
                        geoms = list(feature.geometries())
                        print('Feature {}, {}, {} length: {}'
                              ''.format(category, name, scale, len(geoms)))


if __name__ == '__main__':
    parser = argparse.ArgumentParser(description='Download feature datasets.')
    parser.add_argument('group_names', nargs='+',
                        choices=FEATURE_DEFN_GROUPS,
                        metavar='GROUP_NAME',
                        help='Feature group name: %(choices)s')
    parser.add_argument('--output', '-o',
                        help='save datasets in the specified directory '
                             '(default: user cache directory)')
    parser.add_argument('--dry-run',
                        help='just print the URLs to download',
                        action='store_true')
    parser.add_argument('--ignore-repo-data', action='store_true',
                        help='ignore existing repo data when downloading')
    parser.add_argument('--no-warn',
                        action='store_true',
                        help='ignore cartopy "DownloadWarning" warnings')
    args = parser.parse_args()

    if args.output:
        target_dir = pathlib.Path(args.output).expanduser().resolve()
        target_dir.mkdir(parents=True, exist_ok=True)
        config['pre_existing_data_dir'] = target_dir
        config['data_dir'] = target_dir
    if args.ignore_repo_data:
        config['repo_data_dir'] = config['data_dir']
    if args.no_warn:
        import warnings
        warnings.filterwarnings('ignore', category=DownloadWarning)

    # Enforce use of stable AWS endpoint, regardless of cartopy version.
    # In doing so, this allows users to download this script and execute it
    # with any version of cartopy, thus taking advantage of the stable AWS
    # endpoint.
    # This removes the need to backport the associated fix
    # https://github.com/SciTools/cartopy/pull/1833.
    config['downloaders'][SHP_NE_SPEC].url_template = URL_TEMPLATE

    download_features(args.group_names, dry_run=args.dry_run)

Usage: python feature_download.py GROUP_NAME
The available arguments can be listed with -h:

$> python feature_download.py --help
usage: feature_download.py [-h] [--output OUTPUT] [--dry-run] [--ignore-repo-data] GROUP_NAME [GROUP_NAME ...]

Download feature datasets.

positional arguments:
  GROUP_NAME            Feature group name: cultural-extra, cultural, gshhs, physical

optional arguments:
  -h, --help            show this help message and exit
  --output OUTPUT, -o OUTPUT
                        save datasets in the specified directory (default: user cache directory)
  --dry-run             just print the URLs to download
  --ignore-repo-data    ignore existing repo data when downloading
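
If even the login node has limited outbound access, the --dry-run option only prints the download URLs; the zips can then be fetched on another machine (for example with wget) and unpacked into the corresponding category directory under ~/.local/share/cartopy/shapefiles/natural_earth/, which is where cartopy looks for them:

python feature_download.py --dry-run physical
# each printed URL points to a zip such as .../110m_physical/ne_110m_coastline.zip;
# unpack its contents into ~/.local/share/cartopy/shapefiles/natural_earth/physical/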

Usually there is no need to specify an output directory, so the following commands download all of the data at once:

python feature_download.py physical
python feature_download.py cultural-extra
python feature_download.py cultural
python feature_download.py gshhs
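
Once the downloads finish on the login node, a short Python check confirms where cartopy expects the shapefiles and that they actually arrived; cartopy.config['data_dir'] is the directory the downloader writes to, which normally resolves to ~/.local/share/cartopy:

import pathlib

import cartopy

# directory where cartopy stores and looks for downloaded data
data_dir = pathlib.Path(cartopy.config['data_dir'])
print('data_dir:', data_dir)

# list the Natural Earth shapefiles that have been downloaded
for shp in sorted(data_dir.glob('shapefiles/natural_earth/*/*.shp')):
    print(shp.name)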