Quick one-page web server
This snippet uses nc (netcat) to serve a single file, index.html,
over port 8888. Parameterize or modify to your liking.
while true; do
  {
    printf 'HTTP/1.1 200 OK\r\n'
    printf 'Content-Type: text/html\r\n'
    printf 'Content-Length: %s\r\n' "$(wc -c < index.html)"
    printf 'Connection: close\r\n\r\n'
    cat index.html
  } | nc -lp 8888
  sleep 1
done
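If you want to inspect the exact bytes that go over the wire, you can build the response into a file first and serve that instead of recomputing it per request. This is a sketch; page.html and response.txt are just illustrative names.

```shell
# Sample page for demonstration
printf '<h1>hello</h1>\n' > page.html

# Assemble the raw HTTP response. wc -c gives the exact byte count,
# so Content-Length matches the body precisely.
{
  printf 'HTTP/1.1 200 OK\r\n'
  printf 'Content-Type: text/html\r\n'
  printf 'Content-Length: %s\r\n' "$(wc -c < page.html)"
  printf 'Connection: close\r\n\r\n'
  cat page.html
} > response.txt

# To serve the prebuilt response, one request per iteration:
# while true; do nc -lp 8888 < response.txt; done
```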
HTML listing via Tree
Build a one-page directory listing of your entire site, sorted by mod time.
tree -I index.html --matchdirs -t -H base_url \
-T "Dir Listing" --noreport . > tree.html
The key arguments to the tree command:

-H base_url
  emits HTML output; base_url can be an empty string in quotes.
-I pattern
  excludes files and directories matching pattern.
-P pattern
  lists only files matching pattern.
-t
  sorts by modification time.
-R
  the recursive option; causes tree to rerun at each level and generate an
  index there (00Tree.html), so it won't collide with any index.html you
  may already have.
-L 1
  limits each level's rendered search depth to 1.
Create a recursive HTML directory listing for your entire site.
tree -R -L 1 --dirsfirst -H base_url \
-T "Dir Listing" --noreport . > tree.html
Create a test HTML directory hierarchy with index.html files
# Create the directories
mkdir -vp top{1,2}/middle{1,2}/bottom{1,2}
for dir in $(find . -type d); do echo "$dir: hello" > "$dir/index.html"; done
# Or
echo "hello" > index.html
find . -type d -exec cp -v index.html {} \;
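Putting the two steps together, here is a self-contained sketch that builds the hierarchy in a scratch directory ("site" is an arbitrary name; the brace expansion requires bash) and confirms every directory received an index.html:

```shell
# Build the test hierarchy under ./site
mkdir -p site/top{1,2}/middle{1,2}/bottom{1,2}

# Drop an index.html into every directory (names contain no spaces,
# so the unquoted $(find ...) loop is safe here)
for dir in $(find site -type d); do echo "$dir: hello" > "$dir/index.html"; done

# 1 root + 2 top + 4 middle + 8 bottom = 15 directories, 15 files
find site -name index.html | wc -l
```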
Curl
Log into site via HTTP POST and save cookie.
curl --cookie-jar cookie.txt 'http://www.server.com' \
  -d 'user=<user>&password=<password>'
Download private_file.zip via the previously saved cookie. Follow any redirect.
curl --cookie cookie.txt --location \
"http://www.server.com/private_file.zip" -O
Other HTTP POST alternatives:
curl -v <url> -d @data_post.txt
w3m -post data_post.txt <url>
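The data_post.txt file in both commands is simply the raw urlencoded request body. A minimal sketch of creating one (the user/password values are placeholders):

```shell
# Write the urlencoded body that curl -d @file and w3m -post will read.
# printf without a trailing \n keeps the body exactly as intended.
printf 'user=%s&password=%s' alice 's3cret' > data_post.txt
cat data_post.txt
```

If a value contains characters like & or =, curl's --data-urlencode option can encode it for you instead of hand-building the file.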