Setting Up a YARN Runtime Environment in IDEA

Connecting to a Hadoop YARN cluster directly from a local machine makes development much more convenient: it removes the need to package and deploy code to a Linux environment every time you want to debug a change. This post records the approaches the author has verified.

Method 1:

Overall project layout: (screenshot omitted)
A) pom.xml listing

<?xml version="1.0" encoding="UTF-8"?>

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>

  <groupId>com.david</groupId>
  <artifactId>yarnstatusgetter</artifactId>
  <version>1.0-SNAPSHOT</version>

  <name>yarnstatusgetter</name>

  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
  </properties>

  <dependencies>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>3.8.1</version>
      <scope>test</scope>
    </dependency>

    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-mapreduce-client-core</artifactId>
      <version>2.6.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-hdfs</artifactId>
      <version>2.6.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-common</artifactId>
      <version>2.6.0</version>
    </dependency>

    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-yarn-common</artifactId>
      <version>2.6.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-yarn-client</artifactId>
      <version>2.6.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-yarn-server-resourcemanager</artifactId>
      <version>2.6.0</version>
    </dependency>

  </dependencies>

  <build>
    <pluginManagement><!-- lock down plugins versions to avoid using Maven defaults (may be moved to parent pom) -->
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-surefire-plugin</artifactId>
          <version>2.13</version>
          <configuration>
            <useFile>false</useFile>
            <disableXmlReport>true</disableXmlReport>
            <!-- If you hit classpath issues such as NoClassDefFoundError, ... -->
            <!-- useManifestOnlyJar>false</useManifestOnlyJar -->
            <includes>
              <include>**/*Test.*</include>
              <include>**/*Suite.*</include>
            </includes>
          </configuration>
        </plugin>

        <!-- Bundle all project dependencies into the jar -->
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-assembly-plugin</artifactId>
          <version>2.4.1</version>
          <configuration>
            <!-- get all project dependencies -->
            <descriptorRefs>
              <descriptorRef>jar-with-dependencies</descriptorRef>
            </descriptorRefs>
            <!-- A MainClass entry in the manifest would make the jar executable -->
          </configuration>
          <executions>
            <execution>
              <id>make-assembly</id>
              <!-- bind to the packaging phase -->
              <phase>package</phase>
              <goals>
                <goal>single</goal>
              </goals>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </pluginManagement>
  </build>
</project>

B) Load the YARN / HDFS configuration files
Copy the relevant YARN / HDFS configuration files into the Maven project's resources directory:

  • core-site.xml
  • hdfs-site.xml
  • mapred-site.xml
  • yarn-site.xml

On a CDH cluster, you can log in to Cloudera Manager, open the HDFS and YARN configuration pages, and click "Settings" and then "Download Client Configuration" to obtain them.
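These files are ordinary Hadoop site XML. As a quick sanity check that the copies under resources are intact, a plain-JDK snippet can parse one of them and look up a property. No Hadoop dependency is needed; the file path and property name below are only examples:

```java
import java.io.File;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

public class ConfigSanityCheck {

    // Return the <value> of the <property> whose <name> matches, or null if absent.
    public static String getProp(File siteXml, String name) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().parse(siteXml);
        NodeList props = doc.getElementsByTagName("property");
        for (int i = 0; i < props.getLength(); i++) {
            Element p = (Element) props.item(i);
            String n = p.getElementsByTagName("name").item(0).getTextContent().trim();
            if (name.equals(n)) {
                return p.getElementsByTagName("value").item(0).getTextContent().trim();
            }
        }
        return null;
    }

    public static void main(String[] args) throws Exception {
        // Example path only -- adjust to your own project layout.
        File coreSite = new File("src/main/resources/core-site.xml");
        if (coreSite.exists()) {
            System.out.println("fs.defaultFS = " + getProp(coreSite, "fs.defaultFS"));
        } else {
            System.out.println("core-site.xml not found on the expected path");
        }
    }
}
```

If fs.defaultFS comes back null, the copied file is probably truncated or the resources directory is not on the classpath.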

C) YARN test code

What the test case does:
It lists the applications currently running on YARN:

package com.david;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.yarn.api.records.ApplicationId;
import org.apache.hadoop.yarn.api.records.ApplicationReport;
import org.apache.hadoop.yarn.api.records.YarnApplicationState;
import org.apache.hadoop.yarn.api.records.impl.pb.ApplicationReportPBImpl;
import org.apache.hadoop.yarn.client.api.YarnClient;
import org.apache.hadoop.yarn.client.cli.ApplicationCLI;
import org.apache.hadoop.yarn.exceptions.ApplicationNotFoundException;
import org.apache.hadoop.yarn.exceptions.YarnException;
import org.apache.hadoop.yarn.util.ConverterUtils;

import java.io.IOException;
import java.io.OutputStreamWriter;
import java.io.PrintStream;
import java.io.PrintWriter;
import java.nio.charset.Charset;
import java.text.DecimalFormat;
import java.util.EnumSet;
import java.util.HashSet;
import java.util.List;
import java.util.Set;


public class YarnApplicationManager {

    private YarnClient client;
    protected PrintStream sout = System.out;
    private static final String PRINT_FORMAT =
            new StringBuilder(">>> ")
                    .append("%15s").append("\t")
                    .append("%15s").append("\t")
                    .append("%15s").append("\t")
                    .append("%15s").append("\t")
                    .append("%15s").append("\t")
                    .append("%15s").append("\t")
                    .append("%15s").append("\t")
                    .append("%15s").append("\t")
                    .append("%15s").append("\t")
                    .append(System.getProperty("line.separator")) // Windows: \r\n; Linux/macOS: \n
                    .toString();

    public static void main(String[] args) {
        YarnApplicationManager app = new YarnApplicationManager();
        try {
            app.initYarnClient();

            // 1. List YARN applications and their states
            app.getAppsState();

            // 2. Kill a YARN application by its id
//            String appId = "application_1584696615034_0134";
//            app.killYARNAppByID(appId);
        } catch (Exception ex) {
            ex.printStackTrace();
        } finally {
            app.releaseYarnClient();
        }
    }

    private void initYarnClient() {
        Configuration conf = new Configuration();
        client = YarnClient.createYarnClient();
        client.init(conf);
        client.start();
    }

    private void getAppsState() {
        // Only look at applications that are still active on the cluster
        EnumSet<YarnApplicationState> appStates = EnumSet.of(
                YarnApplicationState.RUNNING,
                YarnApplicationState.ACCEPTED,
                YarnApplicationState.SUBMITTED);
        List<ApplicationReport> appsReport;
        try {
            appsReport = client.getApplications(appStates);
        } catch (YarnException | IOException e) {
            // Without a report list there is nothing to print; bail out early
            e.printStackTrace();
            return;
        }

        PrintWriter writer = new PrintWriter(new OutputStreamWriter(sout, Charset.forName("UTF-8")));
        Set<String> appNameSet = new HashSet<>();

        for (ApplicationReport appReport : appsReport) {
            // Format the application's progress as a percentage
            DecimalFormat decimalFormat = new DecimalFormat("###.##%");
            String progress = decimalFormat.format(appReport.getProgress());
            writer.printf(PRINT_FORMAT, appReport.getApplicationId(), appReport.getName(),
                    appReport.getApplicationType(), appReport.getUser(), appReport.getQueue(),
                    appReport.getYarnApplicationState(), appReport.getFinalApplicationStatus(), progress,
                    appReport.getOriginalTrackingUrl());
            appNameSet.add(appReport.getName());
        }
        writer.flush();

        boolean isSparkStreamingProcessLives = judgeSparkStreamingStatus(appNameSet);
        System.out.println("isSparkStreamingProcessLives = " + isSparkStreamingProcessLives);
    }

    private boolean judgeSparkStreamingStatus(Set<String> appNameSet) {
        return appNameSet.contains("Spark shell");
    }

    private void killYARNAppByID(String applicationId) throws YarnException, IOException {
        ApplicationId appId = ConverterUtils.toApplicationId(applicationId);
        ApplicationReport appReport = null;
        try {
            appReport = client.getApplicationReport(appId);
        } catch (ApplicationNotFoundException e) {
            sout.println("Application with id '" + applicationId +
                    "' doesn't exist in RM.");
            throw e;
        }

        if (appReport.getYarnApplicationState() == YarnApplicationState.FINISHED
                || appReport.getYarnApplicationState() == YarnApplicationState.KILLED
                || appReport.getYarnApplicationState() == YarnApplicationState.FAILED) {
            sout.println("Application " + applicationId + " has already finished ");
        } else {
            sout.println("Killing application " + applicationId);
            client.killApplication(appId);
        }
    }

    private void getAppState() throws Exception {
        String[] args = {"-list"};
        ApplicationCLI.main(args);
    }

    private void releaseYarnClient() {
        if (null != client) {
            try {
                client.close();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }

}
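One note on killYARNAppByID above: ConverterUtils.toApplicationId throws on malformed input, so it can be worth validating the application_&lt;clusterTimestamp&gt;_&lt;sequenceNumber&gt; shape before calling it. A minimal stdlib-only sketch; the class and method names here are mine, not part of the YARN API:

```java
import java.util.regex.Pattern;

public class AppIdValidator {

    // YARN application ids look like: application_<clusterTimestamp>_<sequenceNumber>
    private static final Pattern APP_ID = Pattern.compile("application_\\d+_\\d+");

    public static boolean isValidAppId(String s) {
        return s != null && APP_ID.matcher(s).matches();
    }

    public static void main(String[] args) {
        System.out.println(isValidAppId("application_1584696615034_0134")); // true
        System.out.println(isValidAppId("app_123"));                        // false
    }
}
```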

D) Package with Maven

Packaging command:
mvn assembly:assembly

Note that in the pom.xml above the assembly plugin is declared under pluginManagement, which only pins its version and configuration; a plain "mvn package" therefore will not build the fat jar. Either invoke the goal directly with "mvn assembly:assembly", or move the plugin declaration into build/plugins.

E) Upload to the server and run

General syntax (note that a JVM classpath wildcard must be written as dir/*, not dir/*.jar):
java -cp .:/PATH1/*:/PATH2/* PackageName.MainClassName

For this example:
java -cp yarnstatusgetter-1.0-SNAPSHOT-jar-with-dependencies.jar com.david.YarnApplicationManager

Method 2

If Method 1 does not work, try the following.

A) From the official Hadoop site, download the distribution matching your cluster's version.
(screenshot omitted)
B) Place winutils.exe into the bin directory of the unpacked Hadoop distribution.
(screenshot omitted)
Where to download winutils.exe:
A project named winutils is maintained on GitHub at:
https://github.com/SirMin/winutils
Open the bin directory for your Hadoop version and download the file from there.
(screenshot omitted)
C) Restart the PC.
D) Run the YARN-connecting code from IDEA again.
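If copying winutils.exe alone does not help, note that Hadoop's Windows shims locate their native helpers via the HADOOP_HOME environment variable or the hadoop.home.dir system property. Setting the property before any Hadoop class loads is a common workaround; the path below is an assumption, point it at your own unpacked distribution:

```java
public class WinutilsSetup {
    public static void main(String[] args) {
        // Must be set before the first Hadoop class initializes;
        // the path is an example -- use your own unpack location.
        System.setProperty("hadoop.home.dir", "C:\\opt\\hadoop-2.6.0");
        System.out.println(System.getProperty("hadoop.home.dir"));
        // ...then create the Configuration / YarnClient exactly as in Method 1.
    }
}
```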

