I am not very familiar with coding, but I am looking for a way to download an entire website domain and all of its child pages as HTML files into a directory on my computer. For example: I run the tool with a specified website, it starts a download process, and once it is done I can look in my folder and the entire website is there in the form of HTML files.
I had previously been using the wget command (I am on a Mac) and it is very helpful. However, I cannot stand fetching every single HTML file one by one for a whole domain. I primarily want to download some wikis. If somebody can help me get a command or script that can do this, that would be very helpful.
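My guess is that the answer involves wget's recursive options, maybe something along these lines (the URL is just a placeholder for whichever wiki I want, and I am not at all sure these are the right flags):

    # placeholder URL; I would swap in the wiki I actually want to save
    # --mirror recurses through the whole site, --convert-links rewrites links so
    # pages work offline, --adjust-extension saves pages with a .html extension,
    # --page-requisites grabs images/CSS, and --no-parent keeps it from wandering
    # above the starting path
    wget --mirror --convert-links --adjust-extension --page-requisites --no-parent \
         https://example.com/wiki/

If that is roughly the right idea, confirmation (or corrected flags) would be great; if there is a better tool or script for this, I am open to that too.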