Using Java's HttpClient to Scrape Web Page Data
This article, collected and organized by 生活随笔, introduces how to use Java's HttpClient to scrape web page data, and is shared here for reference.
A web crawler is a program that accesses resources on the network on our behalf. We have always used the HTTP protocol to visit pages on the Internet, and a crawler does the same thing programmatically: it fetches web pages over HTTP.
1. POM dependencies
```xml
<dependency>
    <groupId>org.apache.httpcomponents</groupId>
    <artifactId>httpclient</artifactId>
    <version>4.5.2</version>
</dependency>
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-log4j12</artifactId>
    <version>1.7.25</version>
</dependency>
```
2. log4j configuration file
log4j.properties
```properties
log4j.rootLogger=DEBUG,A1
log4j.logger.com.yfy=DEBUG

log4j.appender.A1=org.apache.log4j.ConsoleAppender
log4j.appender.A1.layout=org.apache.log4j.PatternLayout
log4j.appender.A1.layout.ConversionPattern=%-d{yyyy-MM-dd HH:mm:ss,SSS} [%t] [%c]-[%p] %m%n
```
3. GET request
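The GET example in this section composes its query string with HttpClient's URIBuilder. For comparison, the same URL can be composed with the JDK's java.net.URI and no external dependency. A minimal sketch — the class name UriComposeDemo is made up for this illustration:

```java
import java.net.URI;
import java.net.URISyntaxException;

public class UriComposeDemo {
    // Compose scheme, host, path, and query into a URL string, similar to
    // what URIBuilder does for the GET example below.
    public static String compose(String host, String path, String query) {
        try {
            return new URI("https", host, path, query, null).toString();
        } catch (URISyntaxException e) {
            throw new IllegalArgumentException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(compose("movie.douban.com", "/top250", "start=25"));
        // → https://movie.douban.com/top250?start=25
    }
}
```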
```java
import java.io.IOException;
import java.net.URISyntaxException;

import org.apache.http.client.config.RequestConfig;
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.client.utils.URIBuilder;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.util.EntityUtils;

public class HttpGetTest {
    public static void main(String[] args) throws URISyntaxException {
        // 1. Create the HttpClient object
        CloseableHttpClient httpClient = HttpClients.createDefault();

        // Build the request URI: https://movie.douban.com/top250?start=25
        URIBuilder uriBuilder = new URIBuilder("https://movie.douban.com/top250");
        uriBuilder.setParameter("start", "25");

        // 2. Create the HttpGet object with the target URL
        HttpGet httpGet = new HttpGet(uriBuilder.build());

        // Configure the request. Network or target-server issues can make a
        // request take longer, so set explicit timeouts.
        RequestConfig config = RequestConfig.custom()
                .setConnectTimeout(1000)           // max time to establish a connection, in ms
                .setConnectionRequestTimeout(500)  // max time to obtain a connection from the manager
                .setSocketTimeout(10 * 1000)       // max time for data transfer
                .build();
        httpGet.setConfig(config);                 // apply the config to the request

        System.out.println("Request: " + httpGet);

        // 3. Execute the request with HttpClient and get the response
        CloseableHttpResponse response = null;
        try {
            response = httpClient.execute(httpGet);
            // 4. Parse the response
            if (response.getStatusLine().getStatusCode() == 200) {
                String content = EntityUtils.toString(response.getEntity(), "utf8");
                System.out.println(content);
                System.out.println(content.length());
            }
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            if (response != null) {  // execute() may have failed before assigning response
                try {
                    response.close();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
            try {
                httpClient.close();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
}
```
4. POST request
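For a POST, the form parameters are serialized into an application/x-www-form-urlencoded body. Before the full HttpClient example, here is a dependency-free sketch of that encoding using the JDK's URLEncoder — the class name FormEncodeDemo is made up for this illustration:

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class FormEncodeDemo {
    // Encode one name/value pair the way a form-urlencoded request body is
    // built; UrlEncodedFormEntity performs this for the POST example below.
    public static String pair(String name, String value) {
        return URLEncoder.encode(name, StandardCharsets.UTF_8)
                + "=" + URLEncoder.encode(value, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        System.out.println(pair("keys", "Java"));      // → keys=Java
        System.out.println(pair("q", "hello world"));  // → q=hello+world
    }
}
```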
```java
import java.io.IOException;
import java.io.UnsupportedEncodingException;
import java.util.ArrayList;
import java.util.List;

import org.apache.http.NameValuePair;
import org.apache.http.client.entity.UrlEncodedFormEntity;
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpPost;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.message.BasicNameValuePair;
import org.apache.http.util.EntityUtils;

public class HttpParamTest {
    public static void main(String[] args) throws UnsupportedEncodingException {
        // 1. Create the HttpClient object
        CloseableHttpClient httpClient = HttpClients.createDefault();

        // 2. Create the HttpPost object with the target URL
        HttpPost httpPost = new HttpPost("http://yun.itheima.com/search");

        System.out.println("Request: " + httpPost);

        // Declare a List to hold the form parameters
        List<NameValuePair> params = new ArrayList<>();
        params.add(new BasicNameValuePair("keys", "Java"));

        // Create the form entity and attach it to the request
        UrlEncodedFormEntity formEntity = new UrlEncodedFormEntity(params, "utf8");
        httpPost.setEntity(formEntity);

        // 3. Execute the request with HttpClient and get the response
        CloseableHttpResponse response = null;
        try {
            response = httpClient.execute(httpPost);
            // 4. Parse the response
            if (response.getStatusLine().getStatusCode() == 200) {
                String content = EntityUtils.toString(response.getEntity(), "utf8");
                System.out.println(content.length());
            }
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            if (response != null) {  // execute() may have failed before assigning response
                try {
                    response.close();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
            try {
                httpClient.close();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
}
```
5. Connection pool
如果每次請(qǐng)求都創(chuàng)建HttpClient,會(huì)有頻繁創(chuàng)建和銷毀的問題,可以使用連接池來解決這個(gè)問題
```java
import java.io.IOException;

import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.impl.conn.PoolingHttpClientConnectionManager;
import org.apache.http.util.EntityUtils;

public class HttpClientPoolTest {
    public static void main(String[] args) {
        // Create the pooling connection manager
        PoolingHttpClientConnectionManager cm = new PoolingHttpClientConnectionManager();
        // Maximum total number of connections in the pool
        cm.setMaxTotal(100);
        // Maximum number of connections per route (host)
        cm.setDefaultMaxPerRoute(10);
        // Issue requests through the pooled manager
        doGet(cm);
        doGet(cm);
    }

    private static void doGet(PoolingHttpClientConnectionManager cm) {
        // Instead of creating a new HttpClient each time, obtain one backed by the pool
        CloseableHttpClient httpClient = HttpClients.custom().setConnectionManager(cm).build();

        HttpGet httpGet = new HttpGet("https://movie.douban.com/top250");
        CloseableHttpResponse response = null;
        try {
            response = httpClient.execute(httpGet);
            if (response.getStatusLine().getStatusCode() == 200) {
                String content = EntityUtils.toString(response.getEntity(), "utf8");
                System.out.println(content.length());
            }
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            if (response != null) {
                try {
                    response.close();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
            // Do not close the HttpClient here; its connections are managed by the pool
        }
    }
}
```
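The idea behind PoolingHttpClientConnectionManager — create expensive objects once, then borrow and return them instead of destroying them per request — can be sketched generically with a blocking queue. This is a conceptual illustration only, not HttpClient's internals; SimplePool is a made-up name:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.function.Supplier;

// A minimal generic object pool: pre-create `size` instances, hand them out
// on borrow(), and take them back on release() for reuse.
public class SimplePool<T> {
    private final BlockingQueue<T> idle;

    public SimplePool(int size, Supplier<T> factory) {
        idle = new ArrayBlockingQueue<>(size);
        for (int i = 0; i < size; i++) {
            idle.add(factory.get());
        }
    }

    public T borrow() {
        try {
            return idle.take();  // blocks when the pool is exhausted
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            throw new IllegalStateException(e);
        }
    }

    public void release(T obj) {
        idle.offer(obj);  // return the object for the next caller
    }

    public static void main(String[] args) {
        // With a pool of one, releasing and borrowing again yields the same instance
        SimplePool<StringBuilder> pool = new SimplePool<>(1, StringBuilder::new);
        StringBuilder a = pool.borrow();
        pool.release(a);
        StringBuilder b = pool.borrow();
        System.out.println(a == b);  // → true: the instance was reused, not recreated
    }
}
```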
Summary
The above is the full content of "Using Java's HttpClient to Scrape Web Page Data" as collected and organized by 生活随笔. We hope this article helps you solve the problems you have run into.