Basics Of Web Interaction

Created On 23. Feb 2020

Updated: 2023-02-17 01:29:33.107416000 +0000

Created By: acidghost

A URL is composed of several parts. From the URL a request is formed, which delivers whatever the URL asks for. There are multiple tools and ways to send and modify such requests. I advise looking up an explanation of what the different parts of a URL mean.
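As a quick illustration, Python's standard urllib.parse can split a URL into those parts (the URL here is just a made-up example):

```python
# Break a URL into its components with the standard library.
from urllib.parse import urlsplit

url = "http://127.0.0.1:8080/login?name=Jon&email=Jon%40example.com#top"
parts = urlsplit(url)

print(parts.scheme)    # "http"
print(parts.hostname)  # "127.0.0.1"
print(parts.port)      # 8080
print(parts.path)      # "/login"
print(parts.query)     # "name=Jon&email=Jon%40example.com"
print(parts.fragment)  # "top"
```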

netcat

Netcat can send web requests if the method, request-URI and HTTP version are passed in manually:

POST / HTTP/1.1
Host: 127.0.0.1
Content-Length: 78
Content-Type: application/x-www-form-urlencoded

Other headers, such as Content-Length and Content-Type above, have to be passed in manually as well. In such cases curl can be helpful for getting them right. Netcat also requires the URL parameters to already be URL-encoded.
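Since netcat sends the request bytes verbatim, that encoding has to happen beforehand. A small sketch of URL encoding with Python's standard urllib.parse:

```python
# URL-encode a single value and a whole set of parameters.
from urllib.parse import quote, urlencode

print(quote("a value with spaces"))
# a%20value%20with%20spaces

print(urlencode({"email": "Jon@example.com", "q": "a&b"}))
# email=Jon%40example.com&q=a%26b
```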

curl

When composing web requests in the terminal, curl is usually the first choice. Unlike netcat, curl doesn't require parameters such as Content-Length or the HTTP version to be hardcoded. With the flag -v it discloses information about the current request.

A curl request with some JSON data might look like this:

curl -H "Content-Type: application/json" -d '{"key a":"value", "key b":{"key c": "value", "key d": ["value", "value"]}}' http://127.0.0.1 

python

With python a similar request can be crafted like this (the original snippet had `headers-newHeader` instead of `headers=`, and sending a JSON body fits a POST better than a GET):

import requests

newHeader = {'Content-Type': 'application/json', 'Accept': 'text/plain'}
requests.post("http://127.0.0.1/",
              json={'key a': 'value', 'key b': 'value'},
              headers=newHeader).text

This was possible with the requests library. Let's take a look at another python library, urllib2 (note that urllib2 exists only in Python 2):

import urllib2
body = urllib2.urlopen("http://example.com")
print body.read()

We are sending a simple GET request to our website. However, this is just a raw request, which means nothing client-side, such as JavaScript, will be executed.
We can modify the request headers with urllib2 as follows:

headers = {
    'User-Agent': 'Mozilla/5.0 (X11; Kali; Linux x86_64; rv:41.0) Gecko/20100101 Firefox/41.0'
}
request = urllib2.Request("http://example.com", headers=headers)
url = urllib2.urlopen(request)
response = url.read()
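These urllib2 examples are Python 2 only; in Python 3 the library was merged into urllib.request. A minimal sketch of the same header-setting request in Python 3 (building the Request object does not touch the network; urlopen(request) would actually send it):

```python
# Python 3 equivalent: urllib2.Request became urllib.request.Request.
import urllib.request

request = urllib.request.Request(
    "http://example.com",
    headers={"User-Agent": "Mozilla/5.0 (X11; Linux x86_64)"},
)
print(request.full_url)      # http://example.com
print(request.get_method())  # GET
# Note: Request stores header names capitalized, e.g. "User-agent".
print(request.get_header("User-agent"))
```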

We pass additional parameters this way:

import urllib

fields = {
    'name': 'Jon',
    'email': 'Jon@example.com'
}
params = urllib.urlencode(fields)
u = urllib.urlopen("http://example.com/login?" + params)
data = u.read()
print data

Above, appending the encoded parameters to the URL produces a GET request.
To send a POST instead, pass them separately as a second argument:

import urllib

fields = {
    'name': 'Jon',
    'email': 'Jon@example.com'
}
params = urllib.urlencode(fields)
u = urllib.urlopen("http://example.com/login", params)
data = u.read()
print data
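For reference, the same GET-versus-POST split in Python 3, where urllib was reorganized into urllib.parse and urllib.request (building the Request objects here does not send anything):

```python
# urlencode the fields, then either append them to the URL (GET)
# or pass them as bytes in `data` (POST).
import urllib.request
from urllib.parse import urlencode

fields = {"name": "Jon", "email": "Jon@example.com"}
params = urlencode(fields)
print(params)  # name=Jon&email=Jon%40example.com

# GET: query string in the URL
get_req = urllib.request.Request("http://example.com/login?" + params)
print(get_req.get_method())  # GET

# POST: encoded fields in the request body
post_req = urllib.request.Request("http://example.com/login",
                                  data=params.encode())
print(post_req.get_method())  # POST
```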

Here we create a custom opener with cookies enabled:

opener = urllib2.build_opener(
    urllib2.HTTPCookieProcessor()
)

and create the request:

request = urllib2.Request(
    "http://example.com/login",
    urllib.urlencode(fields))

Send a login request:

url = opener.open(request)
response = url.read()

Now we can access the private pages with the cookie we got from the above login request:

url = opener.open("http://example.com/dashboard")
response = url.read()
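In Python 3 the same cookie-enabled setup uses urllib.request together with http.cookiejar; a minimal sketch (no request is actually sent here):

```python
# Cookie-enabled opener in Python 3: cookies set by the server during
# login land in the jar and are replayed on later opener.open() calls.
import http.cookiejar
import urllib.request

jar = http.cookiejar.CookieJar()
opener = urllib.request.build_opener(
    urllib.request.HTTPCookieProcessor(jar)
)
# opener.open("http://example.com/login", encoded_fields) would log in;
# opener.open("http://example.com/dashboard") would then reuse the cookie.
print(len(jar))  # 0 cookies before any request
```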

Section: Web
