
[Hadoop] MapReduce Case Study: Friend Recommendation Score


Contents
  • I. Prerequisites
  • II. Data Preparation
  • III. Friend Recommendation Case

I. Prerequisites

Refer to the setup stage of the "Word Count" (词频统计) case study.

II. Data Preparation

Generate the friend-relationship data and upload it to HDFS:

package com.hdtrain;

import java.util.HashMap;
import java.util.Map;

public class HelloFriend {
    public static void main(String[] args) {
        Map<Integer, String> friendMap = new HashMap<>();
        // users 100..200 each get 4 random friends drawn from higher id ranges
        for (int i = 100; i <= 200; i++) {
            friendMap.putIfAbsent(i, "");
            for (int j = 1; j <= 4; j++) {
                int friend = (int) (Math.random() * 21 + 200 + 20 * j);
                friendMap.putIfAbsent(friend, "");
                // record the relationship in both directions
                friendMap.put(i, friendMap.get(i) + " " + friend);
                friendMap.put(friend, friendMap.get(friend) + " " + i);
            }
        }
        // print each user once, after all relationships are generated
        for (Integer key : friendMap.keySet()) {
            System.out.println(key + "----" + friendMap.get(key));
        }
    }
}
III. Friend Recommendation Case

1.FriendJob.class

package com.hdtrain;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

import java.io.IOException;

public class FriendJob {
    public static void main(String[] args) throws IOException, ClassNotFoundException, InterruptedException {
        Configuration configuration = new Configuration(true);
        configuration.set("mapreduce.framework.name", "local");

        Job job = Job.getInstance(configuration);
        job.setJobName("Friend--" + System.currentTimeMillis());
        job.setJarByClass(FriendJob.class);
        job.setNumReduceTasks(2);

        FileInputFormat.setInputPaths(job, new Path("/data/friend.txt"));
        FileOutputFormat.setOutputPath(job, new Path("/results/Friend-" + System.currentTimeMillis()));

        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(IntWritable.class);
        // the reducer emits (Text, LongWritable), which differs from the map
        // output types, so the final output types must be set explicitly
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(LongWritable.class);

        job.setMapperClass(FriendMapper.class);
        job.setReducerClass(FriendReducer.class);

        job.waitForCompletion(true);
    }
}

2.FriendMapper.class

package com.hdtrain;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

import java.io.IOException;

public class FriendMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
    @Override
    protected void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException {
        // a line looks like "100---- 220 241 263 284": a user followed by that user's friends
        String[] friends = value.toString().replaceAll("----", "").split(" ");
        if (friends.length > 1) {
            String person = friends[0];
            for (int i = 1; i < friends.length; i++) {
                // direct friendship: mark the pair with -1 so the reducer can exclude it
                context.write(new Text(friendOrder(person, friends[i])), new IntWritable(-1));
                for (int j = i + 1; j < friends.length; j++) {
                    // two friends of the same person share one common friend: count 1
                    context.write(new Text(friendOrder(friends[i], friends[j])), new IntWritable(1));
                }
            }
        }
    }

    // normalize a pair so (A,B) and (B,A) produce the same key
    private String friendOrder(String personA, String personB) {
        return personA.compareTo(personB) > 0 ? (personB + " " + personA) : (personA + " " + personB);
    }
}
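The mapper hinges on two ideas: normalizing each pair into a single ordered key, and tagging direct friendships with -1 versus common-friend co-occurrences with 1. A minimal plain-Java sketch (outside Hadoop; the class name and sample line are illustrative) that mimics the emissions for one input line:

```java
import java.util.ArrayList;
import java.util.List;

public class MapperSketch {
    // same normalization as the mapper: smaller id first
    static String order(String a, String b) {
        return a.compareTo(b) > 0 ? b + " " + a : a + " " + b;
    }

    public static void main(String[] args) {
        String line = "100---- 220 241 263";
        String[] tokens = line.replaceAll("----", "").split(" ");
        List<String> emitted = new ArrayList<>();
        for (int i = 1; i < tokens.length; i++) {
            emitted.add(order(tokens[0], tokens[i]) + "\t-1");   // direct friendship
            for (int j = i + 1; j < tokens.length; j++) {
                emitted.add(order(tokens[i], tokens[j]) + "\t1"); // common-friend pair
            }
        }
        // emits: "100 220 -1", "220 241 1", "220 263 1", "100 241 -1", "241 263 1", "100 263 -1"
        for (String e : emitted) {
            System.out.println(e);
        }
    }
}
```

Because both emission orders of a pair map to the same key, all -1 and 1 values for that pair land in the same reduce call.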

3.FriendReducer.class

package com.hdtrain;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

import java.io.IOException;

public class FriendReducer extends Reducer<Text, IntWritable, Text, LongWritable> {
    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context) throws IOException, InterruptedException {
        long count = 0;
        for (IntWritable value : values) {
            if (value.get() == -1) {
                // the pair are already direct friends: emit nothing for them
                return;
            }
            count += value.get();
        }
        // count = number of common friends = recommendation score
        context.write(key, new LongWritable(count));
    }
}
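The reduce-side logic can be checked in isolation. The sketch below (illustrative class name, not part of the job) reproduces the decision: any -1 in the value list suppresses the pair, otherwise the 1s are summed into the recommendation score.

```java
import java.util.Arrays;
import java.util.List;

public class ReducerSketch {
    // returns the recommendation score, or -1 if the pair are already direct friends
    static long score(List<Integer> values) {
        long count = 0;
        for (int v : values) {
            if (v == -1) {
                return -1; // direct friends: the real reducer emits nothing
            }
            count += v;
        }
        return count;
    }

    public static void main(String[] args) {
        // a pair that received three 1s shares three common friends
        System.out.println(score(Arrays.asList(1, 1, 1)));  // 3
        // a pair that received a -1 is already a direct friendship
        System.out.println(score(Arrays.asList(1, -1, 1))); // -1
    }
}
```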

4. Results

Running the job writes, for each candidate pair that is not already a direct friendship, the number of common friends they share (the recommendation score), split across the two reducer output files.

Reprinted from www.mshxw.com
Original URL: https://www.mshxw.com/it/826336.html