
Ubuntu 22.04 Find Files

Find a file

find ./ -name "file_to_find.txt"  2>&1 | grep -v "Permission denied"

Find In Files (File containing text), recursively from current folder

grep -r "text_to_search_for" .

Ubuntu 20.04 Node Server on localhost with SSL

To get SSL working on your localhost for testing purposes you will need an SSL key and certificate. I generate the following in a certs directory so they can be loaded later by the node server app.

openssl genrsa -des3 -out rootCA.key 2048
openssl req -x509 -new -nodes -key rootCA.key -sha256 -days 1024 -out rootCA.pem
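
If you want to sanity-check the root certificate before importing it, you can inspect it with

openssl x509 -in rootCA.pem -text -noout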

Import the rootCA.pem file into your browser under the ‘Authorities’ tab.

Then create server.cnf as follows

[req]
default_bits = 2048
prompt = no
default_md = sha256
distinguished_name = dn

[dn]
C=US
ST=RandomState
L=RandomCity
O=RandomOrganization
OU=RandomOrganizationUnit
emailAddress=hello@example.com
CN = localhost

… and v3.ext as follows

authorityKeyIdentifier=keyid,issuer
basicConstraints=CA:FALSE
keyUsage = digitalSignature, nonRepudiation, keyEncipherment, dataEncipherment
subjectAltName = @alt_names

[alt_names]
DNS.1 = localhost

Using the above config files you can create the server key and certificate with the following

openssl req -new -sha256 -nodes -out server.csr -newkey rsa:2048 -keyout server.key -config server.cnf

openssl x509 -req -in server.csr -CA rootCA.pem -CAkey rootCA.key -CAcreateserial -out server.crt -days 500 -sha256 -extfile v3.ext
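
As an optional check, confirm the issued certificate carries the localhost subjectAltName from v3.ext

openssl x509 -in server.crt -text -noout | grep -A 1 "Subject Alternative Name"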

Now you can start the server with npm start or node server.js, with server.js as follows (basic example)

const express = require('express')
const app = express()
const https = require('https')
const fs = require('fs')
const port = 3000

app.get('/', (req, res) => {
  res.send('WORKING!')
})

const httpsOptions = {
  key: fs.readFileSync('./certs/server.key'),  // server private key generated above
  cert: fs.readFileSync('./certs/server.crt')  // server certificate (not the CSR)
}
const server = https.createServer(httpsOptions, app).listen(port, () => {
  console.log('server running at ' + port)
})

Open your browser and go to https://localhost:3000 and all should be good.
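
If you prefer to test from the command line first, curl can verify the connection against the root CA generated earlier (paths assume the certs directory used above)

curl --cacert certs/rootCA.pem https://localhost:3000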


Installing Node on Ubuntu 20.04

Step 1: Do the usual first, i.e.

sudo apt update

and

sudo apt upgrade

Step 2: Make sure npm is up to date as well

npm cache clean -f
npm install -g n #if not already installed
sudo n stable

It might take some doing to achieve the above, as I had a few issues to deal with on Ubuntu 20.04. Firstly I had to update npm and node:

curl -fsSL https://deb.nodesource.com/setup_16.x | sudo -E bash -
sudo apt-get install -y nodejs
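
To confirm which versions you ended up with

node -v
npm -v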

Then I had to change the ownership of the global node_modules directory

sudo chown -R $USER /usr/local/lib/node_modules

Only then could I get sudo npm install -g to work as intended.


Ubuntu 18.04 Disable Auto Time Updates

Firstly disable automatic time synchronization

timedatectl set-ntp off

Then simply set the time manually, or the time will still update automatically.

sudo date --set "21 Dec 2020 14:42:00"

To re-enable automatic date/time updates use

timedatectl set-ntp on

To check the status use

timedatectl status

You should see something like the output below. The ‘no’ values mean it is disabled.

Local time: Mon 2020-12-21 15:00:02 SAST
Universal time: Mon 2020-12-21 13:00:02 UTC
RTC time: Mon 2020-12-21 12:56:50
Time zone: Africa/Johannesburg (SAST, +0200)
System clock synchronized: no
systemd-timesyncd.service active: no
RTC in local TZ: no

Cleaning Ubuntu boot partition

Often Ubuntu upgrades will fail due to a no-space error on the boot partition. The only solution is to remove old kernels manually.

The first step is to show which kernel you are currently running, as you do not want to delete this one.

sudo su
uname -r

So change to the boot directory for simplicity and list all existing kernels on the system.

cd /boot
ls -al
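
Alternatively, list the installed kernel packages with dpkg, which shows the exact package names to remove

dpkg --list 'linux-image-*' | grep ^ii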

Then you can remove the older kernels…

apt remove linux-image-4.15.0-99-generic

… and clean up.

apt --purge autoremove
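
To confirm how much space this has freed on the boot partition

df -h /boot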

You can then retry the upgrade

apt full-upgrade


WordPress Security

This applies to self-managed Apache2 servers. Shared servers require different permissions; for wp-config.php, for example, set that file’s permissions to 440 or 400.
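
For example, assuming wp-config.php sits in the site root

chmod 400 wp-config.php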

Site Lockdown

File permissions to lock down the website, run from the website’s home folder. Run them from the root directory, for example, and you will break your server.


chown root:root  -R * 
find . -type d -exec chmod 755 {} \;
find . -type f -exec chmod 644 {} \;

Change the folder ownership of the site to root

chown root:root -R * 

Then, inside wp-content/uploads, change the ownership to the web server user (to allow uploads)

chown www-data:www-data -R * 

To edit any files via FTP, change that ownership to ftpuser:www-data

chown username:www-data -R * 

If you get asked for ftp details when trying to upgrade wordpress, or any plugins or themes, you need to add the following to wp-config.php

define('FS_METHOD','direct');

Refer to https://wordpress.org/support/article/hardening-wordpress/ for more details, especially those regarding MySql.

On the server itself:

Install DenyHosts

Disable root login

Install rkhunter (rootkit hunter) to check for vulnerabilities.

sudo apt-get install rkhunter

Perform check with

sudo rkhunter --check --skip-keypress

or on first run

sudo rkhunter --checkall --skip-keypress

And keep it updated with

sudo rkhunter --update

For Ubuntu server you may have to “fix” /etc/rkhunter.conf

UPDATE_MIRRORS=0 to UPDATE_MIRRORS=1
MIRRORS_MODE=1 to MIRRORS_MODE=0
WEB_CMD="/bin/false" to WEB_CMD=""
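
If you want the check to run on a schedule, an entry along these lines could go in /etc/cron.d/rkhunter (the timing and log path are only examples)

0 3 * * * root /usr/bin/rkhunter --check --skip-keypress --report-warnings-only >> /var/log/rkhunter-cron.log 2>&1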

SSL Localhost

To enable SSL on a localhost website, and stop Chrome from showing it as “unsafe” …

Pertains to Ubuntu 18.04 Bionic, and running Apache 2.4.29

Create localhost.cnf

HOME = .
RANDFILE = $ENV::HOME/.rnd
oid_section = new_oids

[ new_oids ]
tsa_policy1 = 1.2.3.4.1
tsa_policy2 = 1.2.3.4.5.6
tsa_policy3 = 1.2.3.4.5.7

[ ca ]
default_ca = CA_default		# The default ca section

[ CA_default ]
dir = ./localhostCA		# Where everything is kept
certs = $dir/certs		# Where the issued certs are kept
crl_dir = $dir/crl		# Where the issued crl are kept
database = $dir/index.txt	# database index file.
#unique_subject = no		# Set to 'no' to allow creation of
	# several certs with same subject.
new_certs_dir = $dir/newcerts		# default place for new certs.
certificate = $dir/cacert.pem 	# The CA certificate
serial = $dir/serial 		# The current serial number
crlnumber = $dir/crlnumber	# the current crl number
	# must be commented out to leave a V1 CRL
crl = $dir/crl.pem 		# The current CRL
private_key = $dir/private/cakey.pem	# The private key
RANDFILE = $dir/private/.rand	# private random number file
x509_extensions = usr_cert		# The extensions to add to the cert
name_opt = ca_default		# Subject Name options
cert_opt = ca_default		# Certificate field options
default_days = 365			# how long to certify for
default_crl_days = 30			# how long before next CRL
default_md = default		# use public key default MD
preserve = no			# keep passed DN ordering
policy = policy_match

[ policy_match ]
countryName = match
stateOrProvinceName = match
organizationName = match
organizationalUnitName = optional
commonName = supplied
emailAddress = optional

[ policy_anything ]
countryName = optional
stateOrProvinceName = optional
localityName = optional
organizationName = optional
organizationalUnitName = optional
commonName = supplied
emailAddress = optional

[ req ]
default_bits = 2048
default_keyfile = privkey.pem
distinguished_name = req_distinguished_name
attributes = req_attributes
x509_extensions = v3_ca	# The extensions to add to the self signed cert
string_mask = utf8only
req_extensions = v3_req

[ req_distinguished_name ]
countryName = Country Name (2 letter code)
countryName_default = AU
countryName_min = 2
countryName_max = 2
stateOrProvinceName = State or Province Name (full name)
stateOrProvinceName_default = Some-State
localityName = Locality Name (eg, city)
0.organizationName = Organization Name (eg, company)
0.organizationName_default = Internet Widgits Pty Ltd
organizationalUnitName = Organizational Unit Name (eg, section)
commonName = Common Name (e.g. server FQDN or YOUR name)
commonName_max = 64
emailAddress = Email Address
emailAddress_max = 64

[ req_attributes ]
challengePassword = A challenge password
challengePassword_min = 4
challengePassword_max = 20
unstructuredName = An optional company name

[ usr_cert ]
basicConstraints = CA:FALSE
nsComment = "OpenSSL Generated Certificate"
subjectKeyIdentifier = hash
authorityKeyIdentifier = keyid,issuer

[ v3_req ]
basicConstraints = CA:FALSE
keyUsage = nonRepudiation, digitalSignature, keyEncipherment
subjectAltName = @alt_names

[ v3_ca ]
subjectKeyIdentifier = hash
authorityKeyIdentifier = keyid:always,issuer
basicConstraints = critical, CA:TRUE, pathlen:3
keyUsage = critical, cRLSign, keyCertSign
nsCertType = sslCA, emailCA

[ crl_ext ]
authorityKeyIdentifier = keyid:always

[ proxy_cert_ext ]
basicConstraints = CA:FALSE
nsComment = "OpenSSL Generated Certificate"
subjectKeyIdentifier = hash
authorityKeyIdentifier = keyid,issuer
proxyCertInfo = critical,language:id-ppl-anyLanguage,pathlen:3,policy:foo

[ tsa ]
default_tsa = tsa_config1	# the default TSA section

[ tsa_config1 ]
dir = ./demoCA		# TSA root directory
serial = $dir/tsaserial	# The current serial number (mandatory)
crypto_device = builtin		# OpenSSL engine to use for signing
signer_cert = $dir/tsacert.pem 	# The TSA signing certificate
	# (optional)
certs = $dir/cacert.pem	# Certificate chain to include in reply
	# (optional)
signer_key = $dir/private/tsakey.pem # The TSA private key (optional)
signer_digest = sha256			# Signing digest to use. (Optional)
default_policy = tsa_policy1		# Policy if request did not specify it
	# (optional)
other_policies = tsa_policy2, tsa_policy3	# acceptable policies (optional)
digests = sha1, sha256, sha384, sha512  # Acceptable message digests (mandatory)
accuracy = secs:1, millisecs:500, microsecs:100	# (optional)
clock_precision_digits = 0	# number of digits after dot. (optional)
ordering = yes	# Is ordering defined for timestamps?
	# (optional, default: no)
tsa_name = yes	# Must the TSA name be included in the reply?
	# (optional, default: no)
ess_cert_id_chain = no	# Must the ESS cert id chain be included?
	# (optional, default: no)
ess_cert_id_alg = sha1	# algorithm to compute certificate
	# identifier (optional, default: sha1)

[ alt_names ]
DNS.1 = localhost
DNS.2 = *.localhost


Then use openssl to generate the certificates in 3 steps

openssl req -new -x509 -subj "/CN=localhost" -extensions v3_ca -days 3650 -key ca.key.pem -sha256 -out ca.pem -config localhost.cnf
openssl req -subj "/CN=localhost" -extensions v3_req -sha256 -new -key ca.key.pem -out localhost.csr
openssl x509 -req -extensions v3_req -days 3650 -sha256 -in localhost.csr -CA ca.pem -CAkey ca.key.pem -CAcreateserial -out localhost.crt -extfile localhost.cnf
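
Note that these commands assume ca.key.pem already exists; its creation is not shown here, so generate it first if needed, for example with a passphrase (which is the password Apache asks for later)

openssl genrsa -des3 -out ca.key.pem 2048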

Edit /etc/apache2/sites-available/default-ssl.conf as below

<IfModule mod_ssl.c>
        <VirtualHost _default_:443>
                ServerAdmin webmaster@localhost

                DocumentRoot /var/www/html

                ErrorLog ${APACHE_LOG_DIR}/error.log
                CustomLog ${APACHE_LOG_DIR}/access.log combined

                SSLEngine on

                SSLCertificateFile      /home/gavin/ssl/localhost.crt
                SSLCertificateKeyFile /home/gavin/ssl/ca.key.pem

                <FilesMatch "\.(cgi|shtml|phtml|php)$">
                                SSLOptions +StdEnvVars
                </FilesMatch>
                <Directory /usr/lib/cgi-bin>
                                SSLOptions +StdEnvVars
                </Directory>

        </VirtualHost>
</IfModule>

Enable the site

sudo a2ensite default-ssl
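
If the SSL module is not already enabled you may also need

sudo a2enmod ssl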

… and restart Apache2

sudo systemctl restart apache2

As things stand you will be asked for the certificate key password used in the certificate creation process.
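
If you would rather not type that password on every restart, you can write out an unencrypted copy of the key (the file name is just an example, keep it readable by root only) and point SSLCertificateKeyFile at it instead

sudo openssl rsa -in ca.key.pem -out ca.key.nopass.pem
sudo chmod 600 ca.key.nopass.pem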

Then go into Chrome and call https://localhost. The usual warnings will appear about an untrusted site. Go to advanced and select ‘Proceed to unsafe site’.

View the certificate and export it.

Go to Chrome settings and import this certificate under ‘Servers’. Change the certificate settings to be able to identify sites.

In the same settings under ‘Authorities’ import ca.pem, and change its settings to ‘Identify Websites’.

Restart Chrome and all should be in order. Just deal with any issues as they arise, for example you might have to create a .rnd file first.


Ubuntu Must Have Apps

As Open Office is part of the standard Ubuntu install, like other software installed by default, I have not listed it.

There are a number of must-have applications needed to get the most out of Ubuntu 18.04, but the first thing to do is change the desktop. Ubuntu’s standard desktop is way too slow and reminds me of the bad old Microsoft Windows days.

The solution is the xfce4 Desktop.

sudo apt-get install xubuntu-desktop
Multimedia

VLC for watching movies

sudo apt-get install vlc

Kazam for taking screenshots and video capture of desktop

sudo apt-get install kazam

Gimp for editing images

sudo apt-get install gimp

OpenSCAD for CAD drawing

sudo add-apt-repository ppa:openscad/releases
sudo apt-get update
sudo apt-get install openscad
Development

Notepadqq – text editor

sudo add-apt-repository ppa:notepadqq-team/notepadqq
sudo apt-get update
sudo apt-get install notepadqq
3d Printing

Repetier-host for 3d printing – AppImage

https://www.repetier.com/download-now/
CNC Cutting

cncjs – AppImage

https://github.com/cncjs/cncjs/releases
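
AppImages do not need installing; once downloaded, make the file executable and run it (the file name below is just a placeholder)

chmod +x Application.AppImage
./Application.AppImage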

Linux miscellaneous

Pertains mostly to Ubuntu 18.04.4 and Apache 2.4.29

Get Ubuntu version

lsb_release -a

Get Apache version

apache2 -v

Get MySql version

mysql --version

Find Files and hide all permission denied messages

find /start_directory -name filename 2>/dev/null

Copy file with ssh from server to localhost

scp user@server:/directory_from/filename.txt /local_directory/

Copy file from localhost to server with ssh

scp file_to_copy.txt user@servername:/directory_to_copy_to/

Copy directory from server recursively with ssh

scp -r user@server:/directory_from /local_directory/

Copy directory to server from localhost recursively with ssh

scp -r /local_directory/ user@server:/server_directory/

Copy directory with ftp recursively into current directory

sudo wget -r ftp://user:password@server/folder1/folder2/filename

To use ftp to put files onto a server recursively

ncftpput -R -v -u user -p password -P 21 ftp.server.com /directory_to_copy_to /directory_to_copy_from

Import .sql into MySql

mysql -uUser -pPassword Database < sql_file.sql

To export MySql table to .sql file

sudo mysqldump -uUsername -pPassword database table > sql_file.sql

Sharing a Vinyl Cutter over the Network in Linux

This setup works for the cheap no-name-brand Chinese cutting plotter I use.

On the host PC add a new printer and set the printer model to “Raw Queue”.

Set the device URI to, for example, serial:/dev/ttyUSB0?baud=9600. This assumes the serial device is attached to USB port 0.

Make sure it is shared.
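
If you prefer the command line to the printer settings GUI, roughly the same queue can be created and shared with CUPS directly (the queue name VinylCutter is just an example)

sudo lpadmin -p VinylCutter -E -v "serial:/dev/ttyUSB0?baud=9600" -m raw
sudo lpadmin -p VinylCutter -o printer-is-shared=true
sudo cupsctl --share-printers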

On the slave PC simply add the network printer

Enjoy.