Unfortunately, the "^/admin/" rule seems to cause a 500 error and require authorization, so certbot fails to get a cert. Is there a way to make this work so that Pi-hole can run and certbot can renew certs in the future?
I have followed this whole guide, made the cert, used certbot, but I get stuck at the end.
After I edit the lighttpd external.conf file, I can no longer access the /admin/ page of my Pi-hole; I tried that with all the ports.
Nuking that file fixes it for me.
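One approach that may help (a sketch only; the conditional below and its placement are assumptions, not tested against your exact external.conf) is to carve out certbot's HTTP-01 challenge path so those requests are served without auth or redirects:

```
# Hypothetical exception for /etc/lighttpd/external.conf: serve certbot's
# HTTP-01 challenge files plainly. Whether the empty auth.require/url.redirect
# actually override your other rules depends on how and where those rules were
# set, so treat this as a starting point, not a drop-in fix.
$HTTP["url"] =~ "^/\.well-known/acme-challenge/" {
    auth.require = ()
    url.redirect = ()
}
```

Alternatively, certbot's webroot mode (`certbot certonly --webroot -w <document root> -d <fqdn>`) lets lighttpd keep running during renewal, as long as the challenge path above is reachable over plain HTTP.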
Yes, I replaced all the example.com with my own FQDN
Thanks for this solution. I need it because I want to access another Raspberry Pi running Domoticz on my network via HTTPS. But... (sorry, I am new to this)
I did steps 1 and 2, but at the beginning of step 3 I am stuck.
An error is shown if I copy and paste the first line of step 3:
pi@raspberrypi:~ $ sudo cat /etc/letsencrypt/live/mplace.nl/privkey.pem
cat: /etc/letsencrypt/live/myfqdn.nl/privkey.pem: No such file or directory
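The "No such file or directory" usually means the folder under /etc/letsencrypt/live/ is named after the exact FQDN certbot issued for, which may not match the path you pasted; on the Pi, `sudo ls /etc/letsencrypt/live/` shows the real name. A tiny simulation of the mismatch (hypothetical names, runnable anywhere):

```shell
# Simulate certbot's layout: the live/ folder is named after the issued FQDN.
mkdir -p live/mplace.nl
touch live/mplace.nl/privkey.pem

# Pasting a path with a different name fails exactly like in the post:
cat live/myfqdn.nl/privkey.pem 2>&1 || true

# Listing live/ reveals the exact name to use in the path:
ls live/
cat live/mplace.nl/privkey.pem
```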
Certbot is EFF's tool to obtain certs from Let's Encrypt and (optionally) auto-enable HTTPS on your server. It can also act as a client for any other CA that uses the ACME protocol.
The easiest way to install Certbot is by visiting certbot.eff.org, where you can find the correct installation instructions for many web server and OS combinations.
I have done everything as in the example, except that I run my own root CA, which issues my certificates. I have been doing this for quite some time with various services on my private network. It also works quite well with Pi-hole, but I got a warning; does anyone know anything about it?
mitag - I get the same error message. Searching turned up that it has been the default since lighttpd 1.4.28.
Remove the ssl.use-compression = "disable" line from /etc/lighttpd/external.conf and it will go away.
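If you'd rather script the removal, a sed one-liner works; shown here on a throwaway copy (on the Pi you'd point it at /etc/lighttpd/external.conf, with sudo):

```shell
# Demo file standing in for /etc/lighttpd/external.conf:
printf 'ssl.engine = "enable"\nssl.use-compression = "disable"\n' > external.conf

# Delete the obsolete directive, keeping a .bak backup of the original:
sed -i.bak '/ssl\.use-compression/d' external.conf
cat external.conf
```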
@WaLLy3K
Like mitag, I used my own CA; however, it fails to serve the page over SSL/TLS. I tried @fidelito17's method of modifying/adding "mod_alias" to lighttpd.conf, as well as commenting out/removing the ssl.use-compression = "disable" line.
This site can’t be reached dns1.domain.com refused to connect.
Over Chrome (version 67.0.3396.99) hit F12 - Security :
Certificate - valid and trusted
The connection to this site is using a valid, trusted server certificate issued by unknown name.
Resources - all served securely
All resources on this page are served securely.
The certificates do not seem to be the issue. Not sure what I'm missing here.
Please let me know if you have any ideas.
Steps taken :
Modifying the $HTTP["host"] section in the /etc/lighttpd/external.conf file to dns1.domain.com
Pointing ssl.pemfile and ssl.ca-file at the respective files
ssl.pemfile = "/home/pi/Downloads/dns1.pem"
ssl.ca-file = "/home/pi/Downloads/root.pem"
Note - I have tried both RSA- and ECDSA-based CAs, and tried changing ssl.cipher-list; it always refuses to connect. The FQDN does have an A record, and the certificate does have a SAN for the FQDN. I created the dns1.pem via cat server.crt server.key > dns1.pem
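For reference, the cat server.crt server.key > dns1.pem step can be sanity-checked with openssl. A self-contained sketch, using a throwaway self-signed cert as a stand-in for the real CA-issued files:

```shell
# Throwaway key+cert (stand-ins for the real CA-issued server.crt/server.key):
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -keyout server.key -out server.crt -subj "/CN=dns1.domain.com"

# Combine cert and key the way lighttpd's ssl.pemfile expects:
cat server.crt server.key > dns1.pem

# The certificate should be readable straight from the combined file:
openssl x509 -in dns1.pem -noout -subject

# And the private key should pass a consistency check:
openssl rsa -in server.key -check -noout
```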
Permissions :
pi@dns1:~/Downloads$ ls -l
total 16
-rw-r--r-- 1 pi pi 8387 Jul 15 08:46 dns1.pem
-rw-r--r-- 1 pi pi 1838 Jul 15 07:46 root.pem
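One thing worth flagging in that listing: dns1.pem contains the private key but is world-readable (644). Tightening it to owner/group only is a quick fix; demo on a placeholder file (on the Pi you'd also chown it so lighttpd's user can still read it):

```shell
# Placeholder standing in for ~/Downloads/dns1.pem:
touch dns1.pem

# Owner read/write, group read, nothing for others:
chmod 640 dns1.pem
stat -c '%a' dns1.pem
```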
lighttpd -v : lighttpd/1.4.45 (ssl) - a light and fast webserver Build-Date: Jan 14 2017 21:07:19
openssl version : OpenSSL 1.1.0f 25 May 2017
Raspbian version : Linux version 4.14.52-v7+ (dc4@dc4-XPS13-9333) (gcc version 4.9.3 (crosstool-NG crosstool-ng-1.22.0-88-g8460611)) #1123 SMP Wed Jun 27 17:35:49 BST 2018
Edit / Update: The configuration was correct (certs, external.conf, etc.). I was originally using Raspbian with Desktop; after swapping to Stretch Lite it worked right away. I found a post on Reddit where Pi-hole didn't cooperate with the desktop version. While I didn't see any issue with DNS functionality on it, it looks like the desktop version prevented TLS to the web UI. Leaving this here in case someone else comes across it.
I have followed the tutorial provided by @WaLLy3K, and SSL works perfectly for the Web Interface of Pi-hole. However, I am experiencing the following stated issues:
The slowdowns are only occurring on a couple of websites, but according to the post this shouldn't happen. And whenever I try to reach a blocked domain over HTTPS, I don't see the Pi-hole error page, but the following error instead:
I speculate that I will have to change this file every time I update Pi-hole now.
Then add external.conf as stated above
$HTTP["host"] == "url.FQDN.com" {
    # Ensure the Pi-hole Block Page knows that this is not a blocked domain
    setenv.add-environment = ("fqdn" => "true")

    # Enable the SSL engine with a LE cert, only for this specific host
    $SERVER["socket"] == ":443" {
        ssl.engine = "enable"
        ssl.pemfile = "/home/pi/.acme.sh/url.FQDN.com/combined.pem"
        ssl.ca-file = "/home/pi/.acme.sh/url.FQDN.com/fullchain.cer"
        ssl.honor-cipher-order = "enable"
        ssl.cipher-list = "EECDH+AESGCM:EDH+AESGCM:AES256+EECDH:AES256+EDH"
        ssl.use-sslv2 = "disable"
        ssl.use-sslv3 = "disable"
    }

    # Redirect HTTP to HTTPS
    $HTTP["scheme"] == "http" {
        $HTTP["host"] =~ ".*" {
            url.redirect = (".*" => "https://%0$0")
        }
    }
}
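After editing external.conf, lighttpd can validate the syntax before a restart, which catches brace and quote typos early (assuming the stock Raspbian paths):

```shell
# Syntax-check the full configuration (external.conf is included from it);
# prints "Syntax OK" when the config parses:
lighttpd -t -f /etc/lighttpd/lighttpd.conf

# If the check passes, restart to apply:
sudo service lighttpd restart
```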
Finally: I was forwarding a random world-exposed port to port 80 on this Raspberry Pi, and I forgot to change the internal port from 80 to 443 to get SSL working correctly. So remember, kids: check your port-forwarding settings.
I have a similar question. I followed the guide, and it should only be enabled for my Pi-hole FQDN. However, I first tested it by just using the IP address, and with the IP address I also get the cert error. So I expect any other domain will get the cert shown as well? Shouldn't it be bound only to the specific FQDN, per the guide shown here?
I tried around a bit and it seems this is the expected behavior. Once HTTPS is enabled, there is a response on port 443, but for all other sites the response is that there is no valid response (port 443 appears "offline"). It doesn't look great, but there is no other solution like this that won't expire. The only alternative would be to issue certs for all blocked sites via an internal CA, but then that CA would need to be distributed to all systems, and that's not good practice. I just wonder why the IP address behaves this way; maybe it's because no domain name is involved.
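This matches how TLS server name indication (SNI) works: lighttpd only has a certificate for the one FQDN, so connecting by bare IP (no matching server name) cannot present a valid cert. To see exactly which certificate gets served for a given name, openssl's s_client can show it (hypothetical host and IP; adjust to your setup):

```shell
# With SNI set to the configured FQDN, the real certificate is returned:
openssl s_client -connect 192.168.1.2:443 -servername url.FQDN.com </dev/null 2>/dev/null \
  | openssl x509 -noout -subject

# By bare IP (no matching SNI) the handshake fails or a default cert appears:
openssl s_client -connect 192.168.1.2:443 </dev/null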