
A simple Storm example, and a study of migrating Storm jobs to Flink

A basic Storm program example

Storm stream processing is built around Spout (source) and Bolt (processing) nodes; you implement your own logic by extending the base classes BaseRichSpout and BaseRichBolt.

import org.apache.storm.Config;
import org.apache.storm.LocalCluster;
import org.apache.storm.spout.SpoutOutputCollector;
import org.apache.storm.task.OutputCollector;
import org.apache.storm.task.TopologyContext;
import org.apache.storm.topology.OutputFieldsDeclarer;
import org.apache.storm.topology.TopologyBuilder;
import org.apache.storm.topology.base.BaseRichBolt;
import org.apache.storm.topology.base.BaseRichSpout;
import org.apache.storm.tuple.Fields;
import org.apache.storm.tuple.Tuple;
import org.apache.storm.tuple.Values;

public class FlinkStormDemo {

    public static void main(String[] args) {
        // 1. Create the local execution cluster and the topology builder
        LocalCluster stormCluster = new LocalCluster();
        TopologyBuilder builder = new TopologyBuilder();
        // 2. Create the initial data source (spout)
        builder.setSpout("word", new WordSpout());
        // 3. First processing step on the stream
        builder.setBolt("word-1", new WordBolt1(), 1).shuffleGrouping("word");
        // 4. Second processing step on the stream
        builder.setBolt("word-2", new WordBolt2(), 1).shuffleGrouping("word-1");
        // 5. Configure the topology
        Config config = new Config();
        config.setDebug(true);
        // 6. Submit the Storm topology for execution
        stormCluster.submitTopology("storm-task", config, builder.createTopology());
    }

    static class WordSpout extends BaseRichSpout {

        private SpoutOutputCollector spoutOutputCollector;

        @Override
        public void open(Map map, TopologyContext topologyContext, SpoutOutputCollector spoutOutputCollector) {
            this.spoutOutputCollector = spoutOutputCollector;
        }

        @Override
        public void nextTuple() {
            try {
                // emit one tuple roughly every 10 seconds
                Thread.sleep(10000);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
            System.out.println("Initializing data....");
            String initData = "abc";
            spoutOutputCollector.emit(new Values(initData));
        }

        @Override
        public void declareOutputFields(OutputFieldsDeclarer outputFieldsDeclarer) {
            outputFieldsDeclarer.declare(new Fields("word"));
        }
    }

    static class WordBolt1 extends BaseRichBolt {

        private OutputCollector collector;

        @Override
        public void prepare(Map map, TopologyContext topologyContext, OutputCollector outputCollector) {
            this.collector = outputCollector;
        }

        @Override
        public void execute(Tuple tuple) {
            System.out.println("Processing pass 1....");
            // append "def" to the word received from the previous step
            collector.emit(tuple, new Values(tuple.getString(0) + "def"));
            collector.ack(tuple);
        }

        @Override
        public void declareOutputFields(OutputFieldsDeclarer outputFieldsDeclarer) {
            outputFieldsDeclarer.declare(new Fields("word"));
        }
    }


    static class WordBolt2 extends BaseRichBolt {

        private OutputCollector collector;

        @Override
        public void prepare(Map map, TopologyContext topologyContext, OutputCollector outputCollector) {
            this.collector = outputCollector;
        }

        @Override
        public void execute(Tuple tuple) {
            System.out.println("Processing pass 2....");
            // print the final result
            System.out.println("Result: " + tuple.getString(0));
            collector.ack(tuple);
        }

        @Override
        public void declareOutputFields(OutputFieldsDeclarer outputFieldsDeclarer) {

        }
    }
}
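To make the data flow concrete, here is a plain-Java sketch (no Storm involved) of what the topology above computes end to end: the spout supplies "abc", WordBolt1 appends "def", and WordBolt2 prints the result.

```java
import java.util.function.Function;
import java.util.function.Supplier;

public class TopologyDataFlow {
    // stands in for WordSpout: supplies the initial word
    static Supplier<String> spout = () -> "abc";
    // stands in for WordBolt1: appends "def" to the incoming word
    static Function<String, String> bolt1 = word -> word + "def";

    static String run() {
        return bolt1.apply(spout.get());
    }

    public static void main(String[] args) {
        // stands in for WordBolt2: prints the processed word
        System.out.println("Result: " + run());
    }
}
```

The topology simply chains these steps; `shuffleGrouping` controls how tuples are distributed among parallel bolt instances, which with a parallelism of 1 reduces to this straight pipeline.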

Execution result:

Implementing similar functionality with flink-storm

The Flink dependencies need to be upgraded to version 1.7.0; the key addition is the flink-storm jar:

	
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-java</artifactId>
      <version>1.7.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-streaming-java_2.11</artifactId>
      <version>1.7.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-clients_2.11</artifactId>
      <version>1.7.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-storm_2.11</artifactId>
      <version>1.7.0</version>
    </dependency>

Only two changes are needed:

1. Replace LocalCluster with FlinkLocalCluster.
2. When submitting the topology, replace TopologyBuilder.createTopology() with FlinkTopology.createTopology(TopologyBuilder).
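With those two substitutions, the submitting code from the Storm example would look roughly like the sketch below. This assumes the flink-storm_2.11:1.7.0 compatibility layer, where FlinkLocalCluster.getLocalCluster() and FlinkTopology.createTopology(...) are the entry points, and it reuses the WordSpout/WordBolt classes defined earlier.

```java
import org.apache.flink.storm.api.FlinkLocalCluster;
import org.apache.flink.storm.api.FlinkTopology;
import org.apache.storm.Config;
import org.apache.storm.topology.TopologyBuilder;

public class FlinkStormMigrationDemo {
    public static void main(String[] args) throws Exception {
        // the topology definition is unchanged from the Storm version
        TopologyBuilder builder = new TopologyBuilder();
        builder.setSpout("word", new WordSpout());
        builder.setBolt("word-1", new WordBolt1(), 1).shuffleGrouping("word");
        builder.setBolt("word-2", new WordBolt2(), 1).shuffleGrouping("word-1");

        Config config = new Config();
        // change 1: FlinkLocalCluster instead of LocalCluster
        FlinkLocalCluster cluster = FlinkLocalCluster.getLocalCluster();
        // change 2: FlinkTopology.createTopology(builder) instead of builder.createTopology()
        cluster.submitTopology("storm-task", config, FlinkTopology.createTopology(builder));
    }
}
```

Under the hood the compatibility layer translates the Storm topology into a Flink job, so spouts and bolts run unmodified on the Flink runtime.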

Execution result:

Implementing similar functionality with a native Flink program

First, use Kafka to send the initial message "测试数据" ("test data").

import java.util.Properties;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStreamSource;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer;

public class FlinkProducer {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        Properties properties = new Properties();
        properties.put("bootstrap.servers", "10.225.173.107:9092,10.225.173.108:9092,10.225.173.109:9092");
        FlinkKafkaProducer<String> flinkKafkaProducer = new FlinkKafkaProducer<>("flink", new SimpleStringSchema(), properties);
        DataStreamSource<String> source = env.fromElements("测试数据");
        source.addSink(flinkKafkaProducer);
        env.execute();
    }

}

Then receive the initial message "测试数据" from Kafka and process it.

import java.util.Properties;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.datastream.DataStreamSource;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.ProcessFunction;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;
import org.apache.flink.util.Collector;

public class FlinkConsumer {

    public static void main(String[] args) throws Exception {
        // 1. Create the execution environment
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        // 2. Configure and create the initial data source
        Properties properties = new Properties();
        properties.put("bootstrap.servers", "10.225.173.107:9092,10.225.173.108:9092,10.225.173.109:9092");
        FlinkKafkaConsumer<String> flinkKafkaConsumer = new FlinkKafkaConsumer<>("flink", new SimpleStringSchema(), properties);
        DataStreamSource<String> source = env.addSource(flinkKafkaConsumer);
        // 3. Chain successive processing steps on the stream
        source.process(new FlinkBolt1())
              .process(new FlinkBolt2())
              .process(new FlinkBolt3());
        // 4. Execute the Flink job
        env.execute();
    }

    static class FlinkBolt1 extends ProcessFunction<String, String> {

        @Override
        public void open(Configuration parameters) {
            // first processing step starts
        }

        @Override
        public void processElement(String s, Context context, Collector<String> collector) throws Exception {
            System.out.println("Value before pass 1: " + s);
            s += "abc";
            System.out.println("Value after pass 1: " + s);
            collector.collect(s);
        }

        @Override
        public void close() throws Exception {
            // first processing step ends
        }
    }

    static class FlinkBolt2 extends ProcessFunction<String, String> {

        @Override
        public void open(Configuration parameters) {
            // second processing step starts
        }

        @Override
        public void processElement(String s, Context context, Collector<String> collector) throws Exception {
            System.out.println("Value before pass 2: " + s);
            s += "def";
            System.out.println("Value after pass 2: " + s);
            collector.collect(s);
        }

        @Override
        public void close() throws Exception {
            // second processing step ends
        }
    }

    static class FlinkBolt3 extends ProcessFunction<String, String> {

        @Override
        public void open(Configuration parameters) {
            // third processing step starts
        }

        @Override
        public void processElement(String s, Context context, Collector<String> collector) throws Exception {
            System.out.println("Value before pass 3: " + s);
            s += "ghi";
            System.out.println("Value after pass 3: " + s);
            collector.collect(s);
        }

        @Override
        public void close() throws Exception {
            // third processing step ends
        }
    }

}
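Each of the three ProcessFunctions above is a stateless one-to-one transform, so conceptually the pipeline behaves like three chained map steps. A plain-Java sketch with java.util.stream (not Flink) shows the end-to-end value the job produces for the input "测试数据":

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class MapChainDemo {
    static List<String> run() {
        // the same three appends as FlinkBolt1/2/3, as chained map steps
        return Stream.of("测试数据")
                .map(s -> s + "abc")
                .map(s -> s + "def")
                .map(s -> s + "ghi")
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        System.out.println(run());
    }
}
```

In the real Flink job, DataStream.map would express the same stateless transforms more directly; ProcessFunction earns its keep when you need timers, side outputs, or per-key state.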
 

Processing result:
