Today, an article on a somewhat "technical" topic that certainly won't interest everyone: as part of my "real" work, I occasionally need to read the HTTP headers that websites send in response to certain requests. For example, you can find out whether a website's 404 error page actually returns a 404 status code, or whether it merely displays an error message while actually sending a 200 OK status code.
So far I have always used online services for this, which can be found on the net under the term "web sniffer" or similar. But you can also do this very easily with OS X's built-in tools by using the Terminal.
For example, if I want to query which headers the domain www.sir-apfelot.de reports, then I enter this command in the terminal:
curl -I https://www.sir-apfelot.de
This works very well for me, but sometimes there are problems when the server doesn't understand the HEAD request and then doesn't send any headers back to curl. In that case, you can get the complete output by fetching the page with the verbose option "-v". The parts that interest us then appear at the top: curl prefixes response header lines with "<" and request lines with ">". But don't be surprised: you also get the complete source code of the page displayed in the Terminal, which can be quite a lot of text. :)
The correct command for this complete output is:
curl -v https://www.sir-apfelot.de
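If you only want the status code itself and nothing else, curl's "-w" (write-out) option can print it directly. A minimal sketch, again using example.com as a placeholder URL:

```shell
# -s silences progress output, -o /dev/null discards the page body,
# -w "%{http_code}" prints just the numeric HTTP status code
curl -s -o /dev/null -w "%{http_code}\n" https://example.com
```

This performs a normal GET request, so it also works with servers that don't respond properly to HEAD requests, and it is handy in scripts where you want to test many URLs at once.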
Jens has been running the blog since 2012. He appears as Sir Apfelot for his readers and helps them with problems of a technical nature. In his free time he drives electric unicycles, takes photos (preferably with his iPhone, of course), climbs around in the Hessian mountains or hikes with the family. His articles deal with Apple products, news from the world of drones or solutions for current bugs.