
Hive SQL Compilation Process

1. Entry Point

Driver#compile is the entry point of compilation:

private void compile(String command, boolean resetTaskIds, boolean deferClose) throws CommandProcessorResponse 
2. Compilation Steps
  1. Variable substitution
  2. Syntax analysis (generate the abstract syntax tree)
  3. Semantic analysis
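Step 1 (variable substitution) expands `${hivevar:...}` / `${hiveconf:...}` references in the command string before parsing; in Hive this is handled by the `VariableSubstitution` machinery. The class below is a simplified, hypothetical sketch of the idea, not Hive's actual implementation (which also handles the system/env namespaces and nesting limits):

```java
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Minimal sketch of variable substitution (step 1 above).
public class VarSubstitutionSketch {

    // Matches references such as ${hivevar:dt} or ${hiveconf:key}.
    private static final Pattern VAR =
        Pattern.compile("\\$\\{(?:hivevar|hiveconf):([^}]+)\\}");

    static String substitute(String command, Map<String, String> vars) {
        Matcher m = VAR.matcher(command);
        StringBuffer sb = new StringBuffer();
        while (m.find()) {
            // Unresolved variables are left in place, as-is.
            String replacement = vars.getOrDefault(m.group(1), m.group(0));
            m.appendReplacement(sb, Matcher.quoteReplacement(replacement));
        }
        m.appendTail(sb);
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(substitute(
            "SELECT * FROM logs WHERE dt = '${hivevar:dt}'",
            Map.of("dt", "2021-01-01")));
        // prints: SELECT * FROM logs WHERE dt = '2021-01-01'
    }
}
```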
2.2 Syntax Analysis: Generating the Abstract Syntax Tree (Driver#compile)
hookRunner.runBeforeParseHook(command);

ASTNode tree;
boolean parseError = false;
try {
  tree = ParseUtils.parse(command, ctx);
} catch (ParseException e) {
  parseError = true;
  throw e;
} finally {
  hookRunner.runAfterParseHook(command, parseError);
}
ParseUtils#parse
public static ASTNode parse(String command, Context ctx) throws ParseException {
    return parse(command, ctx, null);
}
ParseUtils#parse
public static ASTNode parse(
      String command, Context ctx, String viewFullyQualifiedName) throws ParseException {
    ParseDriver pd = new ParseDriver();
    ASTNode tree = pd.parse(command, ctx, viewFullyQualifiedName);
    tree = findRootNonNullToken(tree);
    handleSetColRefs(tree);
    return tree;
  }
ParseDriver#parse
public ASTNode parse(String command, Context ctx, String viewFullyQualifiedName)
      throws ParseException {
      
    HiveLexerX lexer = new HiveLexerX(new ANTLRNoCaseStringStream(command));
    TokenRewriteStream tokens = new TokenRewriteStream(lexer);
    if (ctx != null) {
      if (viewFullyQualifiedName == null) {
        // Top level query
        ctx.setTokenRewriteStream(tokens);
      } else {
        // It is a view
        ctx.addViewTokenRewriteStream(viewFullyQualifiedName, tokens);
      }
      lexer.setHiveConf(ctx.getConf());
    }
    HiveParser parser = new HiveParser(tokens);
    if (ctx != null) {
      parser.setHiveConf(ctx.getConf());
    }
    parser.setTreeAdaptor(adaptor);
    HiveParser.statement_return r = null;
    try {
      r = parser.statement();
    } catch (RecognitionException e) {
      e.printStackTrace();
      throw new ParseException(parser.errors);
    }

    if (lexer.getErrors().size() == 0 && parser.errors.size() == 0) {
      LOG.debug("Parse Completed");
    } else if (lexer.getErrors().size() != 0) {
      throw new ParseException(lexer.getErrors());
    } else {
      throw new ParseException(parser.errors);
    }

    ASTNode tree = (ASTNode) r.getTree();
    tree.setUnknownTokenBoundaries();
    return tree;
  }
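For intuition: a simple statement such as `SELECT a FROM t` is turned by `HiveParser.statement()` into an AST shaped roughly as follows (token names come from Hive's grammar; the exact tree varies by Hive version and query):

```
TOK_QUERY
├── TOK_FROM
│   └── TOK_TABREF
│       └── TOK_TABNAME (t)
└── TOK_INSERT
    ├── TOK_DESTINATION
    │   └── TOK_DIR
    │       └── TOK_TMP_FILE
    └── TOK_SELECT
        └── TOK_SELEXPR
            └── TOK_TABLE_OR_COL (a)
```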
2.3 Semantic Analysis
// PreAnalyzeHook

BaseSemanticAnalyzer sem = SemanticAnalyzerFactory.get(queryState, tree);

sem.analyze(tree, ctx);

// AfterAnalyzeHook
SemanticAnalyzerFactory.get
public static BaseSemanticAnalyzer get(QueryState queryState, ASTNode tree) throws SemanticException {
    BaseSemanticAnalyzer sem = getInternal(queryState, tree);
    if(queryState.getHiveOperation() == null) {
      String query = queryState.getQueryString();
      if(query != null && query.length() > 30) {
        query = query.substring(0, 30);
      }
      String msg = "Unknown HiveOperation for query='" + query + "' queryId=" + queryState.getQueryId();
      //throw new IllegalStateException(msg);
      LOG.debug(msg);
    }
    return sem;
  }
SemanticAnalyzerFactory.getInternal

If cost-based optimization (CBO) is enabled, CalcitePlanner is used; otherwise, SemanticAnalyzer is used.

 private static BaseSemanticAnalyzer getInternal(QueryState queryState, ASTNode tree)
      throws SemanticException {
    if (tree.getToken() == null) {
      throw new RuntimeException("Empty Syntax Tree");
    } else {
      HiveOperation opType = commandType.get(tree.getType());
      queryState.setCommandType(opType);
      switch (tree.getType()) {
     
      default: {  // Query
        SemanticAnalyzer semAnalyzer = HiveConf
            .getBoolVar(queryState.getConf(), HiveConf.ConfVars.HIVE_CBO_ENABLED) ?
                new CalcitePlanner(queryState) : new SemanticAnalyzer(queryState);
        return semAnalyzer;
      }
      }
    }
  }
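The flag checked above (`HiveConf.ConfVars.HIVE_CBO_ENABLED`) corresponds to the `hive.cbo.enable` configuration property, which can be toggled per session:

```sql
-- choose the Calcite-based planner (CBO) for subsequent queries
set hive.cbo.enable=true;
```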
Source: https://www.mshxw.com/it/303963.html