Chapter 5: Configuring a Hadoop Development Environment in MyEclipse
1. First, download the Eclipse plugin that matches your Hadoop version. Here are two examples:
hadoop-1.2.1 plugin: http://download.csdn.net/download/hanyongan300/6238153
hadoop-2.2.0 plugin: http://blog.csdn.net/twlkyao/article/details/17334693
The previous chapter also explained how to build the plugin for a given version, and these plugins can be found online as well.
2. Copy the plugin into the dropins directory under the MyEclipse installation root.
3. Start MyEclipse and open the Map/Reduce perspective:

【Window】->【Open Perspective】->【Other...】->【Map/Reduce】->【OK】

4. Open the Map/Reduce Locations view:

【Window】->【Show View】->【Other...】->【MapReduce Tools】->【Map/Reduce Locations】->【OK】
5. Add a Hadoop location:

Location name: I entered hadoop; any name will do.
In the Map/Reduce Master box:
These two fields are the IP and port from the mapred.job.tracker property in mapred-site.xml.
Host: the cluster machine where the JobTracker runs, here 192.168.0.155. My MyEclipse is on Windows and connects to Hadoop on Linux, so the real address must be given.
Port: the JobTracker's port, here 9001.
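For reference, the corresponding mapred-site.xml entry on a Hadoop 1.x cluster would look roughly like this, using the host and port above (a sketch, not my exact file):

<property>
  <name>mapred.job.tracker</name>
  <value>192.168.0.155:9001</value>
</property>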
In the DFS Master box:
These two fields are the IP and port from the fs.default.name property in core-site.xml.
Host: the cluster machine where the NameNode runs, here 192.168.0.155.
Port: the NameNode's port, here 9000.
(If the Use M/R master host checkbox is checked, the host defaults to the one in the Map/Reduce Master box; if unchecked, you can enter your own. Here the JobTracker and NameNode are on the same machine, so they are the same and I checked it.)
User name: the user used to connect to Hadoop; the user I created is hadoop.
Then continue to the Advanced parameters tab.
Only hadoop.tmp.dir needs to be filled in here, with the same value configured in core-site.xml.
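As a sketch, the matching core-site.xml entries would look like the following; the hadoop.tmp.dir path is a placeholder, use whatever your cluster actually configures:

<property>
  <name>fs.default.name</name>
  <value>hdfs://192.168.0.155:9000</value>
</property>
<property>
  <name>hadoop.tmp.dir</name>
  <value>/home/hadoop/tmp</value>  <!-- placeholder path -->
</property>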
Then restart MyEclipse, and you should see the connection succeed.
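To double-check the connection outside MyEclipse, you can list the HDFS root from the command line on any machine with the Hadoop client installed (assuming hadoop is on the PATH):

hadoop fs -ls /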
Connecting to DFS lets you browse the HDFS directory tree, and you can operate on HDFS directly from here. Right-click to see the following (a code sketch of the same operations follows the list):
Create new directory: create a directory
Refresh: refresh the view
Upload files to DFS: upload files
Upload directory to DFS: upload a directory
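Here is a minimal sketch of those same operations through Hadoop's FileSystem API; the HDFS URI and user name match the location configured above, while the local file and HDFS paths are made-up examples:

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsOps {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Connect to the NameNode configured above, as user "hadoop".
        FileSystem fs = FileSystem.get(
                URI.create("hdfs://192.168.0.155:9000"), conf, "hadoop");

        // "Create new directory"
        fs.mkdirs(new Path("/user/hadoop/demo"));
        // "Upload files to DFS" (the local path is a made-up example)
        fs.copyFromLocalFile(new Path("D:/data/input.txt"),
                new Path("/user/hadoop/demo/input.txt"));
        // "Refresh": list what is there now
        for (FileStatus st : fs.listStatus(new Path("/user/hadoop/demo"))) {
            System.out.println(st.getPath());
        }
        fs.close();
    }
}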
To create a Hadoop project, first configure 【Window】->【Preferences】->【Hadoop Map/Reduce】 and point it at your local Hadoop installation directory.
Then map the remote Hadoop server's IP locally: open C:\Windows\System32\drivers\etc\hosts and add the IP and hostname of the server where Hadoop is installed.
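For example, the added line might look like the following, where master is a stand-in for whatever hostname your server actually uses:

192.168.0.155    master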
Now you can create a MapReduce project in MyEclipse: 【File】->【New】->【Project...】 and choose Map/Reduce Project.
Then write code as usual. Below is code that operates on HBase:
package hbase;

import java.io.File;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.KeyValue;
import org.apache.hadoop.hbase.client.Delete;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.HBaseAdmin;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.ResultScanner;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.filter.CompareFilter.CompareOp;
import org.apache.hadoop.hbase.filter.SingleColumnValueFilter;
import org.apache.hadoop.hbase.util.Bytes;

/**
 * Create/read/update/delete operations against HBase.
 * @author Administrator
 */
public class HbaseTest {

    private static Configuration conf = null;

    static {
        conf = HBaseConfiguration.create();
        conf.set("hbase.zookeeper.property.clientPort", "2181");
        conf.set("hbase.zookeeper.quorum", "192.168.0.26");
        //conf.set("hbase.master", "192.168.1.25:6000");
        // Workaround for running the Hadoop client on Windows without a local
        // install: point hadoop.home.dir at the current directory and create
        // an empty bin/winutils.exe stub.
        File workaround = new File(".");
        System.getProperties().put("hadoop.home.dir", workaround.getAbsolutePath());
        new File("./bin").mkdirs();
        try {
            new File("./bin/winutils.exe").createNewFile();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    /**
     * Create a table.
     * @param tablename
     * @param cfg column family names
     * @throws IOException
     */
    public static void createTable(String tablename, String[] cfg) throws IOException {
        HBaseAdmin admin = new HBaseAdmin(conf);
        if (admin.tableExists(tablename)) {
            System.out.println("Table already exists!");
        } else {
            HTableDescriptor tableDesc = new HTableDescriptor(tablename);
            for (int i = 0; i < cfg.length; i++) {
                tableDesc.addFamily(new HColumnDescriptor(cfg[i]));
            }
            admin.createTable(tableDesc);
            System.out.println("Table created successfully!");
        }
    }

    /**
     * Delete a table.
     * @param tablename
     * @throws IOException
     */
    public static void deleteTable(String tablename) throws IOException {
        HBaseAdmin admin = new HBaseAdmin(conf);
        admin.disableTable(tablename);
        admin.deleteTable(tablename);
        System.out.println("Table deleted successfully.");
    }

    /**
     * Insert one row.
     * @param tablename
     * @param rowkey
     * @param family
     * @param qualifier
     * @param value
     */
    public static void writeRow(String tablename, String rowkey, String family,
            String qualifier, String value) {
        try {
            HTable table = new HTable(conf, tablename);
            Put put = new Put(Bytes.toBytes(rowkey));
            put.add(Bytes.toBytes(family), Bytes.toBytes(qualifier), Bytes.toBytes(value));
            table.put(put);
            System.out.println("Row inserted successfully.");
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    /**
     * Scan rows between a start and a stop row key.
     * @param tableName
     */
    public static void getAll(String tableName) {
        try {
            HTable table = new HTable(conf, tableName);
            Scan scan = new Scan(Bytes.toBytes("a"), Bytes.toBytes("z"));
            //scan.addColumn(Bytes.toBytes("a"), Bytes.toBytes("z"));
            //SingleColumnValueFilter filter = new SingleColumnValueFilter(Bytes.toBytes("a"),
            //        Bytes.toBytes("z"), CompareOp.NOT_EQUAL, Bytes.toBytes("0"));
            //filter.setFilterIfMissing(true);
            //scan.setFilter(filter);
            ResultScanner ss = table.getScanner(scan);
            for (Result r : ss) {
                for (KeyValue kv : r.raw()) {
                    System.out.print("row key: " + new String(kv.getRow()) + " ");
                    System.out.print("column family: " + new String(kv.getFamily()) + " ");
                    System.out.print("qualifier: " + new String(kv.getQualifier()) + " ");
                    System.out.print("timestamp: " + kv.getTimestamp() + " ");
                    System.out.println("value: " + new String(kv.getValue()));
                }
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    /**
     * Look up a single row.
     * @param tablename
     * @param rowKey
     * @throws IOException
     */
    @SuppressWarnings("deprecation")
    public static void getOne(String tablename, String rowKey) throws IOException {
        HTable table = new HTable(conf, tablename);
        Get get = new Get(rowKey.getBytes());
        Result rs = table.get(get);
        for (KeyValue kv : rs.raw()) {
            System.out.print(new String(kv.getRow()) + " ");
            System.out.print(new String(kv.getFamily()) + ":");
            System.out.print(new String(kv.getQualifier()) + " ");
            System.out.print(kv.getTimestamp() + " ");
            System.out.println(new String(kv.getValue()));
        }
    }

    /**
     * Print all rows.
     */
    public static void getAllRecord(String tableName) {
        try {
            HTable table = new HTable(conf, tableName);
            Scan s = new Scan();
            ResultScanner ss = table.getScanner(s);
            for (Result r : ss) {
                for (KeyValue kv : r.raw()) {
                    System.out.print("row key: " + new String(kv.getRow()) + " ");
                    System.out.print("column family: " + new String(kv.getFamily()) + " ");
                    System.out.print("qualifier: " + new String(kv.getQualifier()) + " ");
                    System.out.print("timestamp: " + kv.getTimestamp() + " ");
                    System.out.println("value: " + new String(kv.getValue()));
                }
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    /**
     * Delete one row.
     */
    public static void delRecord(String tableName, String rowKey) throws IOException {
        HTable table = new HTable(conf, tableName);
        List<Delete> list = new ArrayList<Delete>();
        Delete del = new Delete(rowKey.getBytes());
        list.add(del);
        table.delete(list);
        System.out.println("del record " + rowKey + " ok.");
    }

    public static void main(String[] args) {
        String tablename = "score";
        String[] familys = {"grade", "course"};
        try {
            HbaseTest.createTable(tablename, familys);
            //HbaseTest.delRecord(tablename, "zkb");
        } catch (IOException e) {
            e.printStackTrace();
        }
        //HbaseTest.writeRow(tablename, "zkb", "grade", "title", "5");
        //HbaseTest.writeRow(tablename, "zkb", "course", "click", "90");
        //HbaseTest.writeRow(tablename, "zkb", "course", "url", "97");
        //HbaseTest.writeRow(tablename, "zkb", "course", "author", "87");
        //HbaseTest.writeRow(tablename, "baoniu", "grade", "reply", "4");
        //HbaseTest.writeRow(tablename, "baoniu", "course", "siteName", "89");
        //HbaseTest.writeRow(tablename, "1", "grade", "title", "5");
        //HbaseTest.writeRow(tablename, "1", "course", "click", "90");
        //HbaseTest.writeRow(tablename, "2", "course", "url", "97");
        //HbaseTest.writeRow(tablename, "2", "course", "author", "87");
        //HbaseTest.writeRow(tablename, "3", "grade", "reply", "4");
        //HbaseTest.writeRow(tablename, "3", "course", "siteName", "89");
        //HbaseTest.getOne(tablename, "zkb");
        //HbaseTest.getAllRecord(tablename);
        HbaseTest.getAll(tablename);
    }
}
Reposted from: https://www.cnblogs.com/junrong624/p/3499847.html