ongoing additions, changes, and fixes

This commit is contained in:
Yuri Slobodyanyuk
2022-03-12 16:22:23 +02:00
parent a2f3862fca
commit 0d37f19a23


Use the `-s` option to make it silent:
----
curl -o index.html -s https://yurisk.info
----
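Since `-s` also suppresses error messages, a slightly safer variant (a sketch using standard curl flags) adds `-S` to restore errors and `--fail` to make curl exit non-zero on HTTP errors:

```shell
# -s silences the progress meter but also hides errors;
# -S shows errors again, --fail returns a non-zero exit code on HTTP 4xx/5xx
curl -o index.html -sS --fail https://yurisk.info || echo "download failed (exit $?)"
```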
== Download a web page via GET request setting Chrome version 74 as the User-Agent.
Use `-A` to set the User-Agent.
curl -o Index.html -A "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/74.0.3729.169 Safari/537.36" http://example.com
----
Resources: https://developers.whatismybrowser.com/useragents/explore/
== Download a web page via GET request setting Googlebot version 2.1 as the User-Agent.
----
curl -k -o Index.html https://example.com
----
== Download a page using SOCKS5 proxy listening on 127.0.0.1 port 10443
Useful when you have set up an SSH tunnel to a remote server listening on a local
port, say 10443.
----
curl -x socks5://localhost:10443 https://yurisk.info
----

----
curl -x socks5h://localhost:10443 https://yurisk.info
----
The idea here is to tunnel DNS requests through to the remote end of the tunnel
as well, for example to prevent a DNS leak
(https://en.wikipedia.org/wiki/DNS_leak) for privacy reasons.
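The local SOCKS5 listener itself can be created with OpenSSH's dynamic forwarding; a sketch, where `user@remote-server` is a placeholder for your own host:

```shell
# -D 10443 opens a SOCKS5 proxy on local port 10443;
# -N means "no remote command, just forward". user@remote-server is hypothetical.
ssh -D 10443 -N user@remote-server
```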
== Download a page and report the time spent in every step, starting with DNS resolution:
Source:
https://stackoverflow.com/questions/18215389/how-do-i-measure-request-and-response-times-at-once-using-curl
- Step 1: Put the `-w` (write-out) parameters into a file called, say, _curl-params_ (just for convenience, instead of typing them on the CLI):
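A sketch of what _curl-params_ might contain; the `%{time_*}` variables are real curl `--write-out` names, but the exact selection here is illustrative:

```shell
# Write the -w format into curl-params; each %{...} is a curl timing variable
cat > curl-params <<'EOF'
     time_namelookup:  %{time_namelookup}\n
        time_connect:  %{time_connect}\n
     time_appconnect:  %{time_appconnect}\n
    time_pretransfer:  %{time_pretransfer}\n
       time_redirect:  %{time_redirect}\n
  time_starttransfer:  %{time_starttransfer}\n
          time_total:  %{time_total}\n
EOF
```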
- Step 2: Run curl, reading the write-out parameters from the file with `@`:
----
curl -w "@curl-params" -o /dev/null -s https://example.com
----
== Resolve IP address to the owner's Autonomous System Number
Do so by sending a POST query with form fields to the Team Cymru whois server.
When sending any POST data with form fields, the first task is to get all the
fields. The easiest way to do it is to browse to the page, fill in the form,
open the HTML source, and write down the fields and their values. I did it for
the page at https://asn.cymru.com/ and noted 5 fields to fill with values; the
field that takes the IP address to query is `bulk_paste`. In curl you specify
field values with the `-F 'name=value'` option:
----
curl -s -X POST -F 'action=do_whois' -F 'family=ipv4' -F 'method_whois=whois' -F 'bulk_paste=35.1.33.192' -F 'submit_paste=Submit' https://asn.cymru.com/cgi-bin/whois.cgi | grep "|"
Output:
----
AS | IP | AS Name
36375 | 35.1.33.192 | UMICH-AS-5, US
----
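The pipe-separated reply is easy to post-process; for example, extracting just the AS number (a local sketch using the sample output above):

```shell
# Pull field 1 (the AS number) out of the pipe-separated whois reply
echo '36375   | 35.1.33.192      | UMICH-AS-5, US' |
  awk -F'|' '{gsub(/ /, "", $1); print $1}'
# prints 36375
```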
Resources: https://ec.haxx.se/http/http-post
== Make sure curl follows redirections (`Location:`) automatically, using the correct `Referer` on each redirection
Note: this option causes curl to send `Accept-Encoding: gzip` in the request.
== Verify CORS functionality of a website
----
curl -H "Access-Control-Request-Method: GET" -H "Origin: http://localhost" \
--head https://yurisk.info/2020/03/05/fortiweb-cookbook-content-routing-based-on-url-in-request-configuration/pic1.png
----
Output:
Note: curl checks the `~/.ssh/known_hosts` file to verify the authenticity of the remote host.
----
curl -s http://whatismyip.akamai.com/
----
.Output:
----
87.123.255.103
----
If a website has a repeating pattern in naming its resources, we can use **URL globbing** to fetch them all in one command.
_Output files_: curl remembers the matched glob patterns and we can use them with `-o` to specify custom output filenames.
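For example, `#1` in the `-o` argument refers back to the first glob in the URL (a real curl feature), naming each downloaded file after the matched value:

```shell
# '#1' expands to whatever the [2-9] glob matched, naming each output file
curl -s -o 'checkpoint_#1.html' 'https://yurisk.info/category/checkpoint-ngngx[2-9].html'
```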
1. Fetch all pages at
`https://yurisk.info/category/checkpoint-ngngxNNN.html`, where _NNN_ goes from 2 to 9. Pay attention to the single quotes: when run on the Bash command line, the range `[]` and list `{}` operators would otherwise be interpreted by Bash itself instead of curl.
----
curl -s -O 'https://yurisk.info/category/checkpoint-ngngx[2-9].html'
----
Output directory:
----
ls
checkpoint-ngngx2.html
checkpoint-ngngx3.html
checkpoint-ngngx4.html