

Teacher, could you help me take a look at what is going wrong here? Thanks.

Screenshots of the error:
http://img1.sycdn.imooc.com//5dea61f30001afef10510741.jpg
http://img1.sycdn.imooc.com//5dea621f0001217409860750.jpg

sudo /opt/hadoop/hadoop-2.10.0/bin/hadoop jar /opt/hadoop/hadoop-2.10.0/share/hadoop/tools/lib/hadoop-streaming-2.10.0.jar -files "hdfs_map.py,hdfs_reduce.py" -input /input/student.txt -output /tmp/wordcounttest -mapper "/root/anaconda3/bin/python hdfs_map.py" -reducer "/root/anaconda3/bin/python hdfs_reduce.py"
sudo /opt/hadoop/hadoop-2.10.0/bin/hadoop jar /opt/hadoop/hadoop-2.10.0/share/hadoop/tools/lib/hadoop-streaming-2.10.0.jar -files "hdfs_map.py,hdfs_reduce.py" -input /input/student.txt -output /tmp/wordcounttest -mapper "/root/anaconda3/bin/python hdfs_map.py" -reducer "/root/anaconda3/bin/python hdfs_reduce.py"


#!/opt/anaconda3/bin/python
# -*- coding: utf-8 -*-
import sys

def read_input(file):
    # Yield each input line split into a list of whitespace-separated words.
    for line in file:
        yield line.split()

def main():
    data = read_input(sys.stdin)
    for words in data:
        for word in words:
            # Emit "word<TAB>1" for every word seen.
            print("%s%s%d" % (word, '\t', 1))

if __name__ == '__main__':
    main()
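
As a quick sanity check, the mapper can be exercised on its own, outside Hadoop. A minimal sketch, assuming a local python interpreter on the PATH (any Python 3 works for this test):

echo "hello world hello" | python hdfs_map.py
# prints one "word<TAB>1" line per token:
# hello	1
# world	1
# hello	1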


#!/opt/anaconda3/bin/python
# -*- coding: utf-8 -*-
import sys
from operator import itemgetter
from itertools import groupby

def read_mapper_output(file, separator='\t'):
    # Yield (word, count) pairs parsed from the mapper's tab-separated output.
    for line in file:
        yield line.rstrip().split(separator, 1)

def main():
    data = read_mapper_output(sys.stdin)
    # groupby only merges adjacent equal keys, so the input must already be
    # sorted by word -- which Hadoop's shuffle phase guarantees.
    for current_word, group in groupby(data, itemgetter(0)):
        total_count = sum(int(count) for current_word, count in group)
        print("%s%s%d" % (current_word, '\t', total_count))

if __name__ == '__main__':
    main()
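
Before going back to Hadoop, the whole mapper/reducer pair can be smoke-tested locally. A minimal sketch, assuming student.txt sits in the current directory; the sort step stands in for Hadoop's shuffle phase, which the reducer's groupby relies on to see equal keys next to each other:

cat student.txt | python hdfs_map.py | sort -k1,1 | python hdfs_reduce.py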



1 Answer

Aren't you running the same command twice in one go here? Also, the error details already told you: -files is not supported, but -file is.
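
Following that suggestion, a corrected invocation might look like the sketch below: run once (not twice), with each script passed through its own -file option. All paths are copied from the question:

sudo /opt/hadoop/hadoop-2.10.0/bin/hadoop jar /opt/hadoop/hadoop-2.10.0/share/hadoop/tools/lib/hadoop-streaming-2.10.0.jar \
    -file hdfs_map.py \
    -file hdfs_reduce.py \
    -input /input/student.txt \
    -output /tmp/wordcounttest \
    -mapper "/root/anaconda3/bin/python hdfs_map.py" \
    -reducer "/root/anaconda3/bin/python hdfs_reduce.py"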

