Crawler has trouble logging in.
I am trying to make a program that crawls the front page of a site and gets stats like the number of comments on each post, how long it took a post to reach the front page, upvotes, etc.
I'm having a little trouble figuring out how to log in and navigate correctly.
When I pass real login credentials, nothing is printed to the console; however, when I pass fake credentials, I get the HTML of the login page back. I take this to mean the code is logging in correctly, but I can't quite tell.
My biggest question is after the program logs in, how do I direct it to the front page?
Code:
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;
import java.net.URL;
import java.net.URLConnection;
import java.net.URLEncoder;

public static void connect(String queryString1, String queryString2) throws IOException
{
    // Build the form-encoded request body
    String queryString = "user=" + URLEncoder.encode(queryString1, "UTF-8");
    queryString += "&passwd=" + URLEncoder.encode(queryString2, "UTF-8");

    // Open the connection and enable output so the request becomes a POST
    URL url = new URL("https://ssl.reddit.com/post/login");
    URLConnection urlConnection = url.openConnection();
    urlConnection.setDoOutput(true);

    // Write the query string to the request body
    OutputStreamWriter out = new OutputStreamWriter(
            urlConnection.getOutputStream());
    out.write(queryString);
    out.flush();
    out.close();

    // Read and print the response
    BufferedReader in = new BufferedReader(
            new InputStreamReader(urlConnection.getInputStream()));
    String line;
    while ((line = in.readLine()) != null)
    {
        System.out.println(line);
    }
    in.close();
}
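
From what I've read about `URLConnection`, I think the session comes back in a `Set-Cookie` response header after a successful login, and I'd need to replay that cookie on a follow-up GET of the front page. Here is a sketch of what I'm planning to try — the cookie handling and the front-page URL are my assumptions, not something I've confirmed against Reddit's behavior:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.URL;
import java.net.URLConnection;
import java.util.List;
import java.util.Map;

public class FrontPageFetcher
{
    // Pull the "name=value" pair out of a Set-Cookie header value,
    // dropping attributes like Path and Domain. Pure string handling,
    // so it can be checked without any network access.
    public static String cookiePairOf(String setCookieHeader)
    {
        int semi = setCookieHeader.indexOf(';');
        return (semi >= 0 ? setCookieHeader.substring(0, semi)
                          : setCookieHeader).trim();
    }

    // After logging in, collect every Set-Cookie header from the login
    // response and send the cookies along on a GET of the front page.
    public static void fetchFrontPage(URLConnection loginResponse) throws IOException
    {
        StringBuilder cookies = new StringBuilder();
        Map<String, List<String>> headers = loginResponse.getHeaderFields();
        List<String> setCookies = headers.get("Set-Cookie");
        if (setCookies != null)
        {
            for (String c : setCookies)
            {
                if (cookies.length() > 0) cookies.append("; ");
                cookies.append(cookiePairOf(c));
            }
        }

        // Assumed front-page URL for the site being crawled
        URL frontPage = new URL("https://www.reddit.com/");
        URLConnection conn = frontPage.openConnection();
        conn.setRequestProperty("Cookie", cookies.toString());

        BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream()));
        String line;
        while ((line = in.readLine()) != null)
        {
            System.out.println(line);
        }
        in.close();
    }
}
```

If this is roughly right, then checking whether the login response actually contains a `Set-Cookie` header (via `getHeaderFields()`) would also be a more reliable way to tell whether the login worked than looking at what gets printed.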