Getting Started with Java Crawlers: Simulated Login

Date: 2022-05-14 09:10:28

When designing a crawler, the first step is to browse the target site and get a general picture of it. You will sometimes find that a site requires you to log in before you can reach the sub-pages containing the data you need; in that case you have to add a simulated-login step on top of the basic crawling flow.
Simulated login works essentially the same way as the HttpClient requests described earlier. The only difference is that most sites now log users in via a POST request that carries the required parameters, such as the account name and password. So all we have to do is mimic the real interaction and set the parameters the target site expects on an HttpPost. The following walks through a simulated login to Zhihu as an example:

1. Determine the parameters the login request must carry:

First, determine which parameters the login request carries. Using the packet-capture tool Fiddler mentioned earlier and inspecting the login traffic, the request turns out to carry the following parameters.
Let's analyze them:
- _xsrf: At first I was puzzled about where this parameter comes from, but after searching the page source of the login page I found that it is embedded in the page source returned by every request to www.zhihu.com, and its value is different each time.
- captcha: obviously, the captcha.
- captcha_type: the type of the captcha.
- email: the account name.
- password: the password.
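These parameters ultimately travel to the server as an application/x-www-form-urlencoded body. As a dependency-free sketch of what that body looks like (all values here are placeholders, not real credentials or a real token):

```java
import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;
import java.util.LinkedHashMap;
import java.util.Map;

public class LoginFormSketch {
    // Build an application/x-www-form-urlencoded body from the captured parameters
    static String formBody(Map<String, String> params) {
        try {
            StringBuilder sb = new StringBuilder();
            for (Map.Entry<String, String> e : params.entrySet()) {
                if (sb.length() > 0) sb.append('&');
                sb.append(URLEncoder.encode(e.getKey(), "UTF-8"))
                  .append('=')
                  .append(URLEncoder.encode(e.getValue(), "UTF-8"));
            }
            return sb.toString();
        } catch (UnsupportedEncodingException e) {
            throw new RuntimeException(e); // UTF-8 is always available
        }
    }

    public static void main(String[] args) {
        Map<String, String> params = new LinkedHashMap<String, String>();
        params.put("_xsrf", "token-from-page-source"); // placeholder value
        params.put("captcha", "abcd");                 // placeholder value
        params.put("email", "user@example.com");
        params.put("password", "secret");
        System.out.println(formBody(params));
        // _xsrf=token-from-page-source&captcha=abcd&email=user%40example.com&password=secret
    }
}
```

In the real code below, Apache HttpClient's UrlEncodedFormEntity does this encoding for us.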

2. Obtain the parameters:

  • _xsrf: Send a request to https://www.zhihu.com, download the login page, and read the value straight from the page. The extraction code:
/**
 * Extract the _xsrf token from the page.
 * getPageHtml() is the page-download method defined later.
 */
public String get_xsrf(String url) {
    String page = getPageHtml(url);
    Document doc = Jsoup.parse(page);
    Elements srfs = doc.getElementsByAttributeValue("name", "_xsrf");
    String xsrf = srfs.first().attr("value");
    return xsrf;
}
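The extraction above relies on Jsoup. For readers without that dependency, the same hidden-input value can be pulled out with a regular expression. This is a minimal dependency-free sketch; the HTML fragment is a made-up stand-in for the real page source, and the pattern assumes the name attribute precedes the value attribute:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class XsrfSketch {
    // Pull the value attribute of the hidden input named _xsrf out of raw HTML
    static String extractXsrf(String html) {
        Pattern p = Pattern.compile("name=\"_xsrf\"\\s+value=\"([^\"]+)\"");
        Matcher m = p.matcher(html);
        return m.find() ? m.group(1) : null;
    }

    public static void main(String[] args) {
        // Hypothetical fragment mimicking what the login page source looks like
        String page = "<input type=\"hidden\" name=\"_xsrf\" value=\"3f2a9c\"/>";
        System.out.println(extractXsrf(page)); // prints 3f2a9c
    }
}
```

A proper HTML parser like Jsoup is more robust against attribute reordering, which is why the article's code uses it.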
  • captcha: the captcha. When I first tried logging in, I found that Zhihu was serving a Chinese captcha: you have to pick out the upside-down characters, and the parameter sent to the server is the position you clicked, which the server matches against the positions of the upside-down characters in the image. Rather than analyze that parameter (honestly, it was beyond me), I experimented a few times and found that Zhihu also serves an ordinary digits-and-letters captcha, which is much simpler to handle and, as a bonus, does not require the captcha_type parameter.

    So we only need to download the captcha image locally and then set the corresponding parameter. First, downloading the image: the page source contains no image, so it must be loaded via JavaScript. Packet capture indeed revealed the request that loads the captcha.

    Knowing the source, it should have been as simple as requesting that address just like with _xsrf. But the address itself was another puzzle: where does the r parameter come from? After repeated captures I noticed that r is different every time with no discernible pattern, so I guessed it might be a random number. Replacing r with a random number in the browser did indeed return a captcha image.

    The download itself is then routine. With the image in hand, the second step is to set the captcha parameter. My skills don't extend to automatic recognition, so I settled for entering the captcha at the console: on login, the captcha is downloaded locally, inspected manually, then typed into the console and set on the request parameters. The download-and-input code is as follows:
/**
 * Download the captcha image to a local file.
 * @param url          page to check for a captcha
 * @param desFileName  destination file path
 * @return true if a captcha image was downloaded
 * @throws MalformedURLException
 */
public boolean downloaderCapter(String url, String desFileName) throws MalformedURLException {
    boolean flag = false;
    String page = getPageHtml(url);
    Document doc = Jsoup.parse(page);
    Elements capchas = doc.select("img.Captcha-image");
    System.out.println(capchas.size());
    if (capchas.size() == 0) {
        System.out.println("No captcha required");
    } else {
        // Build a 13-digit random number for the r parameter
        Random rnd = new Random();
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < 13; i++) {
            sb.append((char) ('0' + rnd.nextInt(10)));
        }
        String id = sb.toString();
        System.out.println(id);

        String caurl = "https://www.zhihu.com/captcha.gif?r=" + id + "&type=login";
        // Download the captcha image
        URL captcha_url = new URL(caurl);
        System.out.println(captcha_url);
        File file = new File(desFileName);
        if (file.exists()) {
            file.delete();
        }
        try {
            URLConnection con = captcha_url.openConnection();
            InputStream is = con.getInputStream();
            // 1 KB read buffer
            byte[] bs = new byte[1024];
            // number of bytes read per chunk
            int len;
            OutputStream os = new FileOutputStream(file);
            while ((len = is.read(bs)) != -1) {
                os.write(bs, 0, len);
            }
            is.close();
            os.close();
            flag = true;
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
    return flag;
}
  • Account and password: the plaintext account name and password.

    Comparatively speaking, Zhihu is an easy site to log in to: the login only adds a token that can be found in the page source, plus a captcha. Sites with stricter requirements will also encrypt the password and send the ciphertext to the server as the parameter; in that case you have to work out the encryption scheme and encrypt your own parameter the same way before sending it. In short: whatever parameters the server expects, give it exactly those parameters.
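As a hypothetical illustration of that last point: suppose a site expected an MD5 hex digest of the password instead of the plaintext (the actual scheme varies per site and has to be read out of the login page's JavaScript). The parameter could then be prepared like this:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class PasswordDigestSketch {
    // MD5-hex the password before putting it into the POST parameters
    static String md5Hex(String plaintext) {
        try {
            MessageDigest md = MessageDigest.getInstance("MD5");
            byte[] digest = md.digest(plaintext.getBytes(StandardCharsets.UTF_8));
            StringBuilder sb = new StringBuilder();
            for (byte b : digest) {
                sb.append(String.format("%02x", b));
            }
            return sb.toString();
        } catch (NoSuchAlgorithmException e) {
            throw new RuntimeException(e); // MD5 is always available in the JDK
        }
    }

    public static void main(String[] args) {
        System.out.println(md5Hex("password")); // 5f4dcc3b5aa765d61d8327deb882cf99
    }
}
```

Whatever digest the site's JavaScript computes is what must be reproduced; MD5 here is only a stand-in.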

3. Implement the login:

With all the parameters in hand, the rest is simple (or so I thought): just simulate the request with HttpClient. The code is as follows:

import java.io.BufferedReader;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.io.UnsupportedEncodingException;
import java.net.MalformedURLException;
import java.net.URL;
import java.net.URLConnection;
import java.security.cert.CertificateException;
import java.security.cert.X509Certificate;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Random;

import javax.net.ssl.SSLContext;

import org.apache.http.Header;
import org.apache.http.HttpEntity;
import org.apache.http.HttpHost;
import org.apache.http.NameValuePair;
import org.apache.http.client.ClientProtocolException;
import org.apache.http.client.config.CookieSpecs;
import org.apache.http.client.config.RequestConfig;
import org.apache.http.client.entity.UrlEncodedFormEntity;
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.client.methods.HttpPost;
import org.apache.http.conn.ssl.SSLConnectionSocketFactory;
import org.apache.http.conn.ssl.SSLContextBuilder;
import org.apache.http.conn.ssl.TrustStrategy;
import org.apache.http.cookie.Cookie;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClientBuilder;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.message.BasicNameValuePair;
import org.apache.http.util.EntityUtils;
import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import org.jsoup.select.Elements;



public class ZhiHu3 {
    private String mainurl = "https://www.zhihu.com";
    private String email = "";
    private String password = "";
    private String _xsrf = "";
    boolean daili = false; // whether to route requests through the local proxy below
    HttpClientBuilder httpClientBuilder = HttpClientBuilder.create();
    //CloseableHttpClient httpClient = httpClientBuilder.build();
    CloseableHttpClient httpClient = createSSLClientDefault();
    private HttpHost proxy = new HttpHost("127.0.0.1", 8888, "http");
    private RequestConfig config = RequestConfig.custom().setProxy(proxy).build();
    private String useage = "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.101 Safari/537.36";
    private RequestConfig configtime = RequestConfig.custom().setCircularRedirectsAllowed(true).setSocketTimeout(10000).setConnectTimeout(10000).build();

    public ZhiHu3() {
    }


    public ZhiHu3(String email, String password) {
        this.email = email;
        this.password = password;
    }
    // Client helper: trust all of the peer's (HTTPS) certificates
    private CloseableHttpClient createSSLClientDefault() {
        try {
            SSLContext sslContext = new SSLContextBuilder().loadTrustMaterial(null, new TrustStrategy() {
                // trust every certificate
                public boolean isTrusted(X509Certificate[] chain, String authType) throws CertificateException {
                    return true;
                }
            }).build();
            SSLConnectionSocketFactory sslFactory = new SSLConnectionSocketFactory(sslContext);
            return HttpClients.custom().setSSLSocketFactory(sslFactory).build();
        } catch (Exception e) {
        }
        return HttpClients.createDefault();
    }

    public String getPageHtml(String url) {
        String html = "";
        HttpGet httpget = new HttpGet(url);
        httpget.addHeader("User-Agent", useage);
        httpget.setConfig(configtime);
        try {
            CloseableHttpResponse response = httpClient.execute(httpget);
            HttpEntity entity = response.getEntity();
            html = EntityUtils.toString(entity, "utf-8");
            httpget.releaseConnection();
        } catch (IOException e) {
            e.printStackTrace();
        }
        return html;
    }
    /**
     * Download the captcha image to a local file.
     * @param url          page to check for a captcha
     * @param desFileName  destination file path
     * @return true if a captcha image was downloaded
     * @throws MalformedURLException
     */
    public boolean downloaderCapter(String url, String desFileName) throws MalformedURLException {
        boolean flag = false;
        String page = getPageHtml(url);
        Document doc = Jsoup.parse(page);
        Elements capchas = doc.select("img.Captcha-image");
        System.out.println(capchas.size());
        if (capchas.size() == 0) {
            System.out.println("No captcha required");
        } else {
            // Build a 13-digit random number for the r parameter
            Random rnd = new Random();
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < 13; i++) {
                sb.append((char) ('0' + rnd.nextInt(10)));
            }
            String id = sb.toString();
            String caurl = "https://www.zhihu.com/captcha.gif?r=" + id + "&type=login";
            // Download the captcha image through the same client so the session is preserved
            File file = new File(desFileName);
            if (file.exists()) {
                file.delete();
            }
            try {
                HttpGet getCaptcha = new HttpGet(caurl);
                CloseableHttpResponse imageResponse = httpClient.execute(getCaptcha);
                InputStream is = imageResponse.getEntity().getContent();
                byte[] bs = new byte[1024];
                int len;
                OutputStream os = new FileOutputStream(file);
                while ((len = is.read(bs)) != -1) {
                    os.write(bs, 0, len);
                }
                is.close();
                os.close();
                flag = true;
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
        return flag;
    }
    /**
     * Extract the _xsrf token from the page.
     */
    public String get_xsrf(String url) {
        String page = getPageHtml(url);
        Document doc = Jsoup.parse(page);
        Elements srfs = doc.getElementsByAttributeValue("name", "_xsrf");
        String xsrf = srfs.first().attr("value");
        return xsrf;
    }


    public void login() throws IOException {
        List<NameValuePair> para = new ArrayList<NameValuePair>();
        _xsrf = get_xsrf(mainurl);
        System.out.println(_xsrf);
        para.add(new BasicNameValuePair("_xsrf", _xsrf));
        Map<String, String> header = new HashMap<String, String>();
        header.put("Content-Type", "application/x-www-form-urlencoded");
        header.put("Referer", "https://www.zhihu.com/");
        header.put("User-Agent", useage);
        header.put("X-Requested-With", "XMLHttpRequest");
        header.put("Host", "www.zhihu.com");
        header.put("Origin", "https://www.zhihu.com");
        boolean flag = downloaderCapter(mainurl, "D:\\image.png");
        if (flag) {
            BufferedReader br = new BufferedReader(new InputStreamReader(System.in));
            System.out.println("Enter the captcha:");
            String captcha = br.readLine();
            para.add(new BasicNameValuePair("captcha", captcha));
        }
        para.add(new BasicNameValuePair("email", email));
        para.add(new BasicNameValuePair("password", password));
        para.add(new BasicNameValuePair("rememberme", "true"));
        HttpPost httppost = new HttpPost("https://www.zhihu.com/login/email");
        for (String string : header.keySet()) {
            httppost.addHeader(string, header.get(string));
        }
        httppost.addHeader("X-Xsrftoken", _xsrf);
        if (daili) {
            httppost.setConfig(config);
        }
        httppost.setEntity(new UrlEncodedFormEntity(para, "utf-8"));
        CloseableHttpResponse res = httpClient.execute(httppost);
        int status_code = res.getStatusLine().getStatusCode();
        System.out.println(status_code);
        System.out.println(EntityUtils.toString(res.getEntity(), "utf-8"));
        httppost.releaseConnection();
    }




    public static void main(String[] args) {
        ZhiHu3 zhihu = new ZhiHu3("xxxxxxxxx@qq.com", "xxxxxxxx");
        try {
            zhihu.login();
            String html = zhihu.getPageHtml("https://www.zhihu.com/question/following");
            //System.out.println(html);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

Originally the client was created with httpClientBuilder.build(), but in practice this occasionally threw SSL-certificate errors, so I added the createSSLClientDefault() helper. Once logged in you can crawl whatever data you want; here, after a successful login, I request my "following" page. The printed status code is 200, which indicates the login succeeded.
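The final status-code check can be factored into a small helper. This is a generic sketch of interpreting HTTP status codes after a login POST, not anything specific to Zhihu's endpoint (some sites return 200 even on failure, so checking the response body as well is prudent):

```java
public class LoginStatusSketch {
    // Rough interpretation of the status code returned by a login POST
    static String describe(int statusCode) {
        if (statusCode == 200) return "OK: login request accepted";
        if (statusCode >= 300 && statusCode < 400) return "Redirect: follow the Location header";
        if (statusCode == 401 || statusCode == 403) return "Rejected: check the parameters or captcha";
        return "Unexpected status: " + statusCode;
    }

    public static void main(String[] args) {
        System.out.println(describe(200)); // OK: login request accepted
        System.out.println(describe(403)); // Rejected: check the parameters or captcha
    }
}
```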