# wget.txt
#
# wget -e robots=off -o logfile.txt -v -S -x -E -m -k -K -p http://192.168.1.90/
### As soon as this starts, the terminal will lock up!
#
# tail -f logfile.txt
### In a different terminal, start this to watch the progress of the wget. It is
### best to have that terminal ready and then just hit the Enter key. When the
### mirror is finished, the first terminal will come back; press Ctrl+C (^C) to
### exit the tail program.
#
# Since my site has 42 GB of files, the mirror took several hours (02:23:58, or
# 8638 seconds) as it ran on the Raspberry Pi B Plus at 700 MHz, with 512 MB of
# RAM and a single-core processor. The wget of the site (192.168.1.90) went onto
# a 1 TB hard drive on my Raspberry Pi 400, wired to a 1-gigabit router, which
# in turn was wired to the Raspberry Pi B Plus. I left only a tiny part of
# "ac0xl/" in the "192.168.1.90" directory, but all 42 GB were transferred.
#
# The "logfile.txt" file is what ended up being printed, with only a few very
# brief pauses; most of the time I could not even read part of it, it was going
# by so fast! In the "logs" directory are the test logs "thttpd_log.03, 02, 01"
# that were built during the build and test process. "thttpd_log" is the output
# that was generated as wget pulled the web site. The time shown is when thttpd
# finished each transfer, along with the number of bytes transferred.
#
# I think one will be really surprised at how fast thttpd is on an old Raspberry
# Pi B Plus with only 512 MB of RAM and only one 32-bit core in the processor
# running at 700 MHz! "thttpd" should be really fantastic running on a Raspberry
# Pi Zero 2 W, with a 1 GHz quad-core 64-bit ARM Cortex-A53 CPU, 512 MB of
# LPDDR2 DRAM, 802.11b/g/n wireless LAN, and Bluetooth 4.2 / Bluetooth Low
# Energy (BLE), since a 64-bit OS can handle files that are over 2 gigabytes.
# A 32-bit OS is limited to files of less than 2 gigabytes.
#
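For reference, here is the same wget invocation with each flag annotated (flag meanings per the GNU wget manual; the URL is the author's local thttpd server). The command is only echoed here as a dry run; drop the leading `echo` to actually mirror the site:

```shell
#!/bin/sh
# Flag meanings, per the GNU wget manual:
#   -e robots=off   execute the startup command "robots=off": ignore robots.txt
#   -o logfile.txt  write all of wget's messages to logfile.txt
#   -v              verbose output
#   -S              print the server's HTTP response headers
#   -x              force creation of the local directory hierarchy
#   -E              append .html to files served as text/html
#   -m              mirror: recursion plus time-stamping, infinite depth
#   -k              convert links in saved pages to point at the local copies
#   -K              keep the original file (as .orig) before converting links
#   -p              also fetch page requisites (images, CSS, and so on)
URL="http://192.168.1.90/"
FLAGS="-e robots=off -o logfile.txt -v -S -x -E -m -k -K -p"
echo wget $FLAGS "$URL"   # dry run; remove "echo" to start the mirror
```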
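A quick back-of-the-envelope check of the quoted numbers, assuming the 42 GB figure means decimal gigabytes (42 x 10^9 bytes):

```shell
#!/bin/sh
# Average transfer rate for the run quoted above: 42 GB in 8638 seconds
# (02:23:58). Assumes decimal gigabytes, i.e. 42 * 10^9 bytes.
BYTES=42000000000
ELAPSED=8638
MB_PER_S=$(awk "BEGIN { printf \"%.1f\", $BYTES / $ELAPSED / 1000000 }")
MBIT_PER_S=$(awk "BEGIN { printf \"%.0f\", $BYTES * 8 / $ELAPSED / 1000000 }")
echo "average rate: $MB_PER_S MB/s (about $MBIT_PER_S Mbit/s)"
```

Roughly 39 Mbit/s sustained is a respectable average for a single-core 700 MHz Pi B Plus, whose on-board Ethernet tops out at 100 Mbit/s.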
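The 2-gigabyte ceiling mentioned above comes from the largest value a signed 32-bit file offset can hold; a quick sketch of the arithmetic:

```shell
#!/bin/sh
# A signed 32-bit off_t can address at most 2^31 - 1 bytes, which is why a
# 32-bit OS without large-file support caps file sizes just under 2 GB.
MAX_OFFSET=$(awk 'BEGIN { printf "%d", 2^31 - 1 }')
echo "max 32-bit signed offset: $MAX_OFFSET bytes (2 GiB minus 1 byte)"
```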