
Spark: Looking Up an IP's Geolocation — Binary Search, try/catch, and Storing the Results in MySQL

Published: 2024/9/27 · Author: 豆豆

1. Create a Maven project

To change the location of the local Maven repository, see: http://blog.csdn.net/tototuzuoquan/article/details/74571374

2. Write the pom.xml

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>cn.toto.spark</groupId>
    <artifactId>bigdata</artifactId>
    <version>1.0-SNAPSHOT</version>

    <properties>
        <maven.compiler.source>1.7</maven.compiler.source>
        <maven.compiler.target>1.7</maven.compiler.target>
        <encoding>UTF-8</encoding>
        <scala.version>2.10.6</scala.version>
        <spark.version>1.6.2</spark.version>
        <hadoop.version>2.6.4</hadoop.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>${scala.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>${hadoop.version}</version>
        </dependency>
        <dependency>
            <groupId>mysql</groupId>
            <artifactId>mysql-connector-java</artifactId>
            <version>5.1.38</version>
        </dependency>
    </dependencies>

    <build>
        <sourceDirectory>src/main/scala</sourceDirectory>
        <testSourceDirectory>src/test/scala</testSourceDirectory>
        <plugins>
            <plugin>
                <groupId>net.alchim31.maven</groupId>
                <artifactId>scala-maven-plugin</artifactId>
                <version>3.2.2</version>
                <executions>
                    <execution>
                        <goals>
                            <goal>compile</goal>
                            <goal>testCompile</goal>
                        </goals>
                        <configuration>
                            <args>
                                <arg>-make:transitive</arg>
                                <arg>-dependencyfile</arg>
                                <arg>${project.build.directory}/.scala_dependencies</arg>
                            </args>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>2.4.3</version>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                        <configuration>
                            <filters>
                                <filter>
                                    <artifact>*:*</artifact>
                                    <excludes>
                                        <exclude>META-INF/*.SF</exclude>
                                        <exclude>META-INF/*.DSA</exclude>
                                        <exclude>META-INF/*.RSA</exclude>
                                    </excludes>
                                </filter>
                            </filters>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>

3. Prepare the input files

The IP rules file (ip.txt) looks like this:

1.0.1.0|1.0.3.255|16777472|16778239|亞洲|中國|福建|福州||電信|350100|China|CN|119.306239|26.075302
1.0.8.0|1.0.15.255|16779264|16781311|亞洲|中國|廣東|廣州||電信|440100|China|CN|113.280637|23.125178
1.0.32.0|1.0.63.255|16785408|16793599|亞洲|中國|廣東|廣州||電信|440100|China|CN|113.280637|23.125178
1.1.0.0|1.1.0.255|16842752|16843007|亞洲|中國|福建|福州||電信|350100|China|CN|119.306239|26.075302
1.1.2.0|1.1.7.255|16843264|16844799|亞洲|中國|福建|福州||電信|350100|China|CN|119.306239|26.075302
1.1.8.0|1.1.63.255|16844800|16859135|亞洲|中國|廣東|廣州||電信|440100|China|CN|113.280637|23.125178
1.2.0.0|1.2.1.255|16908288|16908799|亞洲|中國|福建|福州||電信|350100|China|CN|119.306239|26.075302
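Each rule record is pipe-delimited: fields 2 and 3 (0-based) are the numeric start and end of the IP range, and field 6 is the province. A minimal Python sketch of the parsing, using the first record above (the field positions match what the Scala code later extracts):

```python
def parse_rule(line):
    """Split one ip.txt record; return (start_num, end_num, province)."""
    fields = line.split("|")
    # fields[2]/fields[3] are the range bounds; fields[6] is the province
    return int(fields[2]), int(fields[3]), fields[6]

rule = "1.0.1.0|1.0.3.255|16777472|16778239|亞洲|中國|福建|福州||電信|350100|China|CN|119.306239|26.075302"
start, end, province = parse_rule(rule)
```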

The access-log file (access.log) looks like this:

20090121000132095572000|125.213.100.123|show.51.com|/shoplist.php?phpfile=shoplist2.php&style=1&sex=137|Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; Mozilla/4.0(Compatible Mozilla/4.0(Compatible-EmbeddedWB 14.59 http://bsalsa.com/ EmbeddedWB- 14.59 from: http://bsalsa.com/ )|http://show.51.com/main.php|
20090121000132124542000|117.101.215.133|www.jiayuan.com|/19245971|Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; TencentTraveler 4.0)|http://photo.jiayuan.com/index.php?uidhash=d1c3b69e9b8355a5204474c749fb76ef|__tkist=0; myloc=50%7C5008; myage=2009; PROFILE=14469674%3A%E8%8B%A6%E6%B6%A9%E5%92%96%E5%95%A1%3Am%3Aphotos2.love21cn.com%2F45%2F1b%2F388111afac8195cc5d91ea286cdd%3A1%3A%3Ahttp%3A%2F%2Fimages.love21cn.com%2Fw4%2Fglobal%2Fi%2Fhykj_m.jpg; last_login_time=1232454068; SESSION_HASH=8176b100a84c9a095315f916d7fcbcf10021e3af; RAW_HASH=008a1bc48ff9ebafa3d5b4815edd04e9e7978050; COMMON_HASH=45388111afac8195cc5d91ea286cdd1b; pop_1232093956=1232468896968; pop_time=1232466715734; pop_1232245908=1232469069390; pop_1219903726=1232477601937; LOVESESSID=98b54794575bf547ea4b55e07efa2e9e; main_search:14469674=%7C%7C%7C00; registeruid=14469674; REG_URL_COOKIE=http%3A%2F%2Fphoto.jiayuan.com%2Fshowphoto.php%3Fuid_hash%3D0319bc5e33ba35755c30a9d88aaf46dc%26total%3D6%26p%3D5; click_count=0%2C3363619
20090121000132406516000|117.101.222.68|gg.xiaonei.com|/view.jsp?p=389|Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; CIBA)|http://home.xiaonei.com/Home.do?id=229670724|_r01_=1; __utma=204579609.31669176.1231940225.1232462740.1232467011.145; __utmz=204579609.1231940225.1.1.utmccn=(direct)
20090121000132581311000|115.120.36.118|tj.tt98.com|/tj.htm|Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; TheWorld)|http://www.tt98.com/|
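The job only needs the client IP from each log record, which is field 1 (0-based) of the pipe-delimited line. A one-line Python sketch, using a copy of the first record truncated for brevity:

```python
# Truncated copy of the first access.log record (trailing fields omitted)
log = "20090121000132095572000|125.213.100.123|show.51.com|/shoplist.php"
ip = log.split("|")[1]  # field 1 holds the client IP
```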

4. Look up an IP's geolocation

package cn.toto.spark

import java.io.{BufferedReader, FileInputStream, InputStreamReader}

import scala.collection.mutable.ArrayBuffer

/**
  * Created by toto on 2017/7/8.
  * Look up an IP's geolocation.
  */
object IPLocationDemo {

  def ip2Long(ip: String): Long = {
    val fragments = ip.split("[.]")
    var ipNum = 0L
    for (i <- 0 until fragments.length) {
      ipNum = fragments(i).toLong | ipNum << 8L
    }
    ipNum
  }

  def readData(path: String) = {
    val br = new BufferedReader(new InputStreamReader(new FileInputStream(path)))
    var s: String = null
    var flag = true
    val lines = new ArrayBuffer[String]()
    while (flag) {
      s = br.readLine()
      if (s != null)
        lines += s
      else
        flag = false
    }
    lines
  }

  def binarySearch(lines: ArrayBuffer[String], ip: Long): Int = {
    var low = 0
    var high = lines.length - 1
    while (low <= high) {
      val middle = (low + high) / 2
      if ((ip >= lines(middle).split("\\|")(2).toLong) && (ip <= lines(middle).split("\\|")(3).toLong))
        return middle
      if (ip < lines(middle).split("\\|")(2).toLong)
        high = middle - 1
      else {
        low = middle + 1
      }
    }
    -1
  }

  /**
    * Expected output:
    * 2016917821
    * 120.55.0.0|120.55.255.255|2016870400|2016935935|亞洲|中國|浙江|杭州||阿里巴巴|330100|China|CN|120.153576|30.287459
    *
    * 2016917821 falls between 2016870400 and 2016935935.
    * @param args
    */
  def main(args: Array[String]): Unit = {
    val ip = "120.55.185.61"
    val ipNum = ip2Long(ip)
    println(ipNum)
    val lines = readData("E:\\learnTempFolder\\ip.txt")
    val index = binarySearch(lines, ipNum)
    print(lines(index))
  }
}
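ip2Long packs the four octets into one number by shifting the accumulator left 8 bits and OR-ing in each octet, i.e. a.b.c.d becomes ((a*256 + b)*256 + c)*256 + d; binarySearch then compares that number against each sorted rule's [start, end] range. A Python transcription of the two functions (same logic, no file I/O; the rules list is a small hypothetical sample):

```python
def ip2long(ip):
    """Pack a dotted-quad IP into a single integer (mirrors the Scala ip2Long)."""
    num = 0
    for fragment in ip.split("."):
        num = int(fragment) | (num << 8)
    return num

def binary_search(rules, ip_num):
    """rules: list of (start_num, end_num) sorted by start; return index or -1."""
    low, high = 0, len(rules) - 1
    while low <= high:
        middle = (low + high) // 2
        start, end = rules[middle]
        if start <= ip_num <= end:
            return middle
        if ip_num < start:
            high = middle - 1
        else:
            low = middle + 1
    return -1

rules = [(16777472, 16778239), (16779264, 16781311), (2016870400, 2016935935)]
num = ip2long("120.55.185.61")   # 2016917821, as printed by the Scala demo
idx = binary_search(rules, num)  # lands in the 2016870400-2016935935 range
```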

Run result (screenshot omitted): the program prints 2016917821 followed by the matching rule line for 杭州.


5. Look up IP geolocations with Spark and store the results in MySQL

The code:

package cn.toto.spark

import java.sql.{Connection, Date, DriverManager, PreparedStatement}

import org.apache.spark.{SparkConf, SparkContext}

/**
  * Created by toto on 2017/7/8.
  */
object IPLocation {

  val data2MySQL = (iterator: Iterator[(String, Int)]) => {
    var conn: Connection = null
    var ps: PreparedStatement = null
    val sql = "INSERT INTO location_info (location, counts, accesse_date) VALUES (?, ?, ?)"
    try {
      conn = DriverManager.getConnection("jdbc:mysql://192.168.106.100:3306/bigdata", "root", "123456")
      iterator.foreach(line => {
        ps = conn.prepareStatement(sql)
        ps.setString(1, line._1)
        ps.setInt(2, line._2)
        ps.setDate(3, new Date(System.currentTimeMillis()))
        ps.executeUpdate()
      })
    } catch {
      case e: Exception => println("Mysql Exception")
    } finally {
      if (ps != null)
        ps.close()
      if (conn != null)
        conn.close()
    }
  }

  def ip2Long(ip: String): Long = {
    val fragments = ip.split("[.]")
    var ipNum = 0L
    for (i <- 0 until fragments.length) {
      ipNum = fragments(i).toLong | ipNum << 8L
    }
    ipNum
  }

  def binarySearch(lines: Array[(String, String, String)], ip: Long): Int = {
    var low = 0
    var high = lines.length - 1
    while (low <= high) {
      val middle = (low + high) / 2
      if ((ip >= lines(middle)._1.toLong) && (ip <= lines(middle)._2.toLong))
        return middle
      if (ip < lines(middle)._1.toLong)
        high = middle - 1
      else {
        low = middle + 1
      }
    }
    -1
  }

  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local[2]").setAppName("IpLocation")
    val sc = new SparkContext(conf)
    val ipRulesRdd = sc.textFile("E://workspace//ip.txt").map(line => {
      val fields = line.split("\\|")
      val start_num = fields(2)
      val end_num = fields(3)
      val province = fields(6)
      (start_num, end_num, province)
    })
    // collect all IP mapping rules
    val ipRulesArrary = ipRulesRdd.collect()
    // broadcast the rules
    val ipRulesBroadcast = sc.broadcast(ipRulesArrary)
    // load the data to process
    val ipsRDD = sc.textFile("E://workspace//access.log").map(line => {
      val fields = line.split("\\|")
      fields(1)
    })
    val result = ipsRDD.map(ip => {
      val ipNum = ip2Long(ip)
      val index = binarySearch(ipRulesBroadcast.value, ipNum)
      val info = ipRulesBroadcast.value(index)
      // info is (range start num, range end num, province name)
      info
    }).map(t => (t._3, 1)).reduceByKey(_ + _)
    // write the results to MySQL
    result.foreachPartition(data2MySQL(_))
    //println(result.collect().toBuffer)
    sc.stop()
  }
}
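The pipeline broadcasts the collected rules to every executor, maps each log IP to its province, and reduceByKey sums the per-province hit counts. The same aggregation can be sketched locally in Python with a Counter standing in for reduceByKey (hypothetical sample rules and IPs, not taken from the files above):

```python
from collections import Counter

def ip2long(ip):
    """Pack a dotted-quad IP into a single integer."""
    num = 0
    for fragment in ip.split("."):
        num = int(fragment) | (num << 8)
    return num

def binary_search(rules, ip_num):
    """rules: list of (start_num, end_num) sorted by start; return index or -1."""
    low, high = 0, len(rules) - 1
    while low <= high:
        middle = (low + high) // 2
        if rules[middle][0] <= ip_num <= rules[middle][1]:
            return middle
        if ip_num < rules[middle][0]:
            high = middle - 1
        else:
            low = middle + 1
    return -1

# (start_num, end_num, province), the same shape as the broadcast rules
rules = [
    (16777472, 16778239, "福建"),
    (16844800, 16859135, "廣東"),
]
bounds = [(s, e) for s, e, _ in rules]
ips = ["1.0.1.5", "1.1.8.2", "1.1.9.9"]  # hypothetical access-log IPs

# Map each IP to its province, then count: the local analogue of
# .map(t => (t._3, 1)).reduceByKey(_ + _)
counts = Counter(rules[binary_search(bounds, ip2long(ip))][2] for ip in ips)
```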

Database DDL:

CREATE DATABASE bigdata CHARACTER SET utf8;

USE bigdata;

CREATE TABLE location_info (
  id INT(10) AUTO_INCREMENT PRIMARY KEY,
  location VARCHAR(100),
  counts INT(10),
  accesse_date DATE
) ENGINE=INNODB DEFAULT CHARSET=utf8;
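The data2MySQL helper opens one JDBC connection per partition and wraps the inserts in try/catch/finally so the connection is always closed, even on failure. The same pattern can be sketched in Python, using the stdlib sqlite3 module in place of MySQL so it runs anywhere (the table schema mirrors location_info; column names are copied from the DDL above):

```python
import sqlite3

def save_counts(rows, db_path=":memory:"):
    """Insert (location, counts) rows; always close the connection (mirrors data2MySQL)."""
    conn = None
    try:
        conn = sqlite3.connect(db_path)
        conn.execute(
            "CREATE TABLE IF NOT EXISTS location_info ("
            "id INTEGER PRIMARY KEY AUTOINCREMENT, "
            "location TEXT, counts INTEGER, accesse_date DATE)")
        for location, counts in rows:
            # Parameterized insert, like the PreparedStatement in the Scala code
            conn.execute(
                "INSERT INTO location_info (location, counts, accesse_date) "
                "VALUES (?, ?, date('now'))", (location, counts))
        conn.commit()
        return conn.execute("SELECT location, counts FROM location_info").fetchall()
    except Exception as e:
        print("DB exception:", e)
        return []
    finally:
        if conn is not None:
            conn.close()

result = save_counts([("浙江", 2), ("福建", 1)])
```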

Run the program; the per-province counts appear in the location_info table (screenshot omitted).
