
Scala Example: Simulating Spark's Distributed Computation Flow


Scenario

Two Executor processes each receive a task, consisting of a slice of the data plus the compute logic, dispatched by the Driver over a socket.
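The code below uses only the Scala and Java standard libraries (plain sockets plus Java serialization), so a bare sbt project is enough to run it. A minimal build.sbt might look like the sketch below; the Scala version and project name are assumptions, and any Scala 2.12/2.13 release should work, since the code relies on Scala 2 syntax such as `var datas: List[Int] = _`.

// build.sbt, minimal sketch; Scala version and project name are assumptions
ThisBuild / scalaVersion := "2.13.12"

lazy val root = (project in file("."))
  .settings(
    name := "spark-flow-simulation"
  )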

Code

Executor1

package com.zxl.bigdata.spark.core.test

import java.io.{InputStream, ObjectInputStream}
import java.net.{ServerSocket, Socket}

object Executor {
  def main(args: Array[String]): Unit = {
    // Start the server to receive data
    val server = new ServerSocket(9999)
    println("Server started, waiting for data")

    // Wait for the client (the Driver) to connect
    val client: Socket = server.accept()
    val in: InputStream = client.getInputStream
    val objIn = new ObjectInputStream(in)

    // Deserialize the task and compute it locally
    val task: SubTask = objIn.readObject().asInstanceOf[SubTask]
    val ints: List[Int] = task.compute()
    println("Result computed by compute node [9999]: " + ints)

    objIn.close()
    client.close()
    server.close()
  }
}
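As written, Executor handles exactly one connection and then shuts down, which is all the demo needs. For reference, a hedged sketch of a longer-lived variant (the object name LoopingExecutor and the loop are additions, not part of the original example):

package com.zxl.bigdata.spark.core.test

import java.io.ObjectInputStream
import java.net.ServerSocket

// Illustrative variant: keep accepting tasks on port 9999
// instead of exiting after the first one.
object LoopingExecutor {
  def main(args: Array[String]): Unit = {
    val server = new ServerSocket(9999)
    println("Server started, waiting for tasks")
    while (true) {
      val client = server.accept()
      val objIn = new ObjectInputStream(client.getInputStream)
      val task = objIn.readObject().asInstanceOf[SubTask]
      println("Result computed by compute node [9999]: " + task.compute())
      objIn.close()
      client.close()
    }
  }
}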

Executor2

package com.zxl.bigdata.spark.core.test

import java.io.{InputStream, ObjectInputStream}
import java.net.{ServerSocket, Socket}

object Executor2 {
  def main(args: Array[String]): Unit = {
    // Start the server to receive data
    val server = new ServerSocket(8888)
    println("Server started, waiting for data")

    // Wait for the client (the Driver) to connect
    val client: Socket = server.accept()
    val in: InputStream = client.getInputStream
    val objIn = new ObjectInputStream(in)

    // Deserialize the task and compute it locally
    val task: SubTask = objIn.readObject().asInstanceOf[SubTask]
    val ints: List[Int] = task.compute()
    println("Result computed by compute node [8888]: " + ints)

    objIn.close()
    client.close()
    server.close()
  }
}

Task

package com.zxl.bigdata.spark.core.test

class Task extends Serializable {

  // The full data set, to be split across the two executors
  val datas = List(1, 2, 3, 4)

  // Compute logic: double each element
  //val logic = ( num: Int ) => { num * 2 }
  val logic: (Int) => Int = _ * 2
}
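A quick local check of the data and logic above, with no sockets involved; the object name TaskCheck and the output comments are illustrative additions:

package com.zxl.bigdata.spark.core.test

// Sanity check of Task's data and logic (illustrative, not part of the original example)
object TaskCheck {
  def main(args: Array[String]): Unit = {
    val task = new Task()
    println(task.logic(3))              // 6
    println(task.datas.map(task.logic)) // List(2, 4, 6, 8)
  }
}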

SubTask

package com.zxl.bigdata.spark.core.test

class SubTask extends Serializable {

  // Slice of the data assigned to this executor
  var datas: List[Int] = _
  // Compute logic shipped from the Driver
  var logic: (Int) => Int = _

  // Apply the logic to this slice of the data
  def compute() = {
    datas.map(logic)
  }
}
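What the Driver actually ships over each socket is a Java-serialized SubTask carrying a data slice and the function. Below is a minimal in-memory round trip of that mechanism, assuming the classes above are on the classpath; the object name SubTaskRoundTrip is an illustrative addition:

package com.zxl.bigdata.spark.core.test

import java.io.{ByteArrayInputStream, ByteArrayOutputStream, ObjectInputStream, ObjectOutputStream}

// Serialize a SubTask to bytes and back, then compute: the same mechanism
// the Driver/Executor sockets use, minus the network (illustrative sketch).
object SubTaskRoundTrip {
  def main(args: Array[String]): Unit = {
    val task = new Task()
    val sub = new SubTask()
    sub.logic = task.logic
    sub.datas = task.datas.take(2)

    // Write the SubTask to an in-memory byte array
    val bytes = new ByteArrayOutputStream()
    val objOut = new ObjectOutputStream(bytes)
    objOut.writeObject(sub)
    objOut.close()

    // Read it back and run the computation on the restored copy
    val objIn = new ObjectInputStream(new ByteArrayInputStream(bytes.toByteArray))
    val restored = objIn.readObject().asInstanceOf[SubTask]
    println(restored.compute()) // List(2, 4)
  }
}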

Driver

package com.zxl.bigdata.spark.core.test

import java.io.{ObjectOutputStream, OutputStream}
import java.net.Socket

object Driver {
  def main(args: Array[String]): Unit = {
    // Connect to the two executor servers
    val client1 = new Socket("localhost", 9999)
    val client2 = new Socket("localhost", 8888)

    val task = new Task()

    // Ship the first half of the data plus the logic to the executor on 9999
    val out1: OutputStream = client1.getOutputStream
    val objOut1 = new ObjectOutputStream(out1)

    val subTask = new SubTask()
    subTask.logic = task.logic
    subTask.datas = task.datas.take(2)

    objOut1.writeObject(subTask)
    objOut1.flush()
    objOut1.close()
    client1.close()

    // Ship the second half of the data plus the logic to the executor on 8888
    val out2: OutputStream = client2.getOutputStream
    val objOut2 = new ObjectOutputStream(out2)

    val subTask1 = new SubTask()
    subTask1.logic = task.logic
    subTask1.datas = task.datas.takeRight(2)

    objOut2.writeObject(subTask1)
    objOut2.flush()
    objOut2.close()
    client2.close()

    println("Client finished sending data")
  }
}

Program Run Log
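Starting Executor (port 9999) and Executor2 (port 8888) first and then running Driver should produce console output along these lines; this is the expected output derived from the code above, not a captured log:

Executor (9999):
  Server started, waiting for data
  Result computed by compute node [9999]: List(2, 4)

Executor2 (8888):
  Server started, waiting for data
  Result computed by compute node [8888]: List(6, 8)

Driver:
  Client finished sending data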


Summary

This example mimics Spark's Driver/Executor interaction in miniature: the Driver splits the data defined in Task, wraps each slice together with the compute logic in a serializable SubTask, and ships it to an executor over a socket using Java serialization; each executor deserializes its SubTask, runs compute(), and prints its partial result. The ingredients here (serializable tasks, shipped closures, and data partitioned across workers) are the same ones Spark builds on at scale.