2 Answers

Contributed 1803 experience points, earned 3+ upvotes
Yes, Spark can listen on a TCP port and process any incoming data. What you are looking for is Spark Streaming.
For convenience:
import java.util.Arrays;
import org.apache.spark.*;
import org.apache.spark.api.java.function.*;
import org.apache.spark.streaming.*;
import org.apache.spark.streaming.api.java.*;
import scala.Tuple2;
// Create a local StreamingContext with two working threads and a batch interval of 1 second
SparkConf conf = new SparkConf().setMaster("local[2]").setAppName("NetworkWordCount");
JavaStreamingContext jssc = new JavaStreamingContext(conf, Durations.seconds(1));
// Create a DStream that will connect to hostname:port, like localhost:9999
JavaReceiverInputDStream<String> lines = jssc.socketTextStream("localhost", 9999);
// Split each line into words
JavaDStream<String> words = lines.flatMap(x -> Arrays.asList(x.split(" ")).iterator());
// Count each word in each batch
JavaPairDStream<String, Integer> pairs = words.mapToPair(s -> new Tuple2<>(s, 1));
JavaPairDStream<String, Integer> wordCounts = pairs.reduceByKey((i1, i2) -> i1 + i2);
// Print the first ten elements of each RDD generated in this DStream to the console
wordCounts.print();
jssc.start();              // Start the computation
jssc.awaitTermination();   // Wait for the computation to terminate

Contributed 1813 experience points, earned 2+ upvotes
Spark has no built-in TCP server that waits for producers and buffers data. Spark works on a polling mechanism over TCP, Kafka, etc. through its API libraries. To consume incoming TCP data, you need an external TCP server that Spark can connect to, as Shaido explained in the example above.
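To illustrate the point about an external server: `socketTextStream` is a client, so something else must own the listening port and write newline-delimited text to connected clients. A minimal sketch of such a server is below (the class name `LineServer` and the sample lines are made up for illustration; port 9999 matches the example above):

```java
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;

// Minimal TCP text server that Spark's socketTextStream("localhost", 9999)
// could connect to. Each println becomes one line in the resulting DStream.
public class LineServer {
    public static void main(String[] args) throws Exception {
        try (ServerSocket server = new ServerSocket(9999)) {
            // Block until a client (e.g. the Spark receiver) connects
            try (Socket client = server.accept();
                 PrintWriter out = new PrintWriter(client.getOutputStream(), true)) {
                out.println("hello spark");
                out.println("hello streaming");
            }
        }
    }
}
```

For quick interactive testing you can skip writing a server entirely and run netcat (`nc -lk 9999`), which is what the Spark Streaming documentation itself uses, then type lines into the terminal.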