Jedv

D3 project 2 – Public DNS World Nodes Map (Part 1)

I have been working on my final project for my time with D3. I decided to focus purely on (Geo|Topo)JSON-style world maps and to parse and then display some network information, using a combination of GeoIP lookups and longitude/latitude coordinates.

For the first part of this process, I needed to find some data and get it ready for use. I stumbled across this website: https://public-dns.info/.

Public DNS servers are globally indexed on this site with some basic information on their reliability, reverse DNS, software version and so on – and, most of all, their IP addresses: perfect for what we need.

I downloaded the full CSV list of all ‘valid’ nameservers, which is located at https://public-dns.info/nameservers.csv – there are many thousands of nameservers listed at the time of writing. This is too many for a nice-looking node map, so I used some GNU/Linux trickery to get a randomly selected CSV of 50 nodes.

$ shuf -n 50 nameservers.csv > c.csv

Make sure the CSV header is copied to the top of your new c.csv file – this is important for the next step, which is getting geographical locations for these IP addresses.
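One way to keep the header while sampling is to write it out first and then append the shuffled rows. The first two lines below just fabricate a small stand-in nameservers.csv so the commands run anywhere – in practice you would use the real download:

```shell
# Stand-in for the real download from public-dns.info (replace with the real file):
printf 'ip,name,country_id\n' > nameservers.csv
for i in $(seq 1 60); do printf '10.0.0.%s,,XX\n' "$i" >> nameservers.csv; done

# Keep the header row, then append 50 randomly chosen data rows:
head -n 1 nameservers.csv > c.csv
tail -n +2 nameservers.csv | shuf -n 50 >> c.csv
```

The `tail -n +2` skips the header before shuffling, so the header can never be sampled twice.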

$ cat c.csv
ip,name,country_id,city,version,error,dnssec,reliability,checked_at,created_at
161.0.153.46,,TT,Port of Spain,dnsmasq-2.49,,false,1.00,2019-02-27T13:00:21Z,2017-01-08T22:03:27Z
201.76.162.156,mvx-201-76-162-156.mundivox.com.,BR,Rio de Janeiro,,,false,0.99,2019-02-27T12:54:31Z,2015-01-10T15:46:41Z
.........

I chose the https://api.ipstack.com service to get my GeoIP information, but there are a few services around; ipstack lets you make a certain number of calls per day for free, and we only need 50 calls to get our data.json file ready for D3.

I used Node.js along with a couple of modules from npm to hit the API and build a data.json file neatly organised with all the data we need.

$ npm install --save request request-promise-native csvtojson

Once you have the required modules, it’s as simple as a few neat lines of Node.js – this file is parse_csv.js.

/* Parse the DNS public node list to JSON, then collect GeoIP data for each IP.
 * By Jed V. (2019 root@livingthedream.fun) */
const csv = require('csvtojson');
const fs = require('fs');
const rp = require('request-promise-native'); // uses 'request' internally; no need to require it directly

const orig = 'c.csv';

console.log(`Parsing file: ${orig}`);

function initial_csvtojson() {
    csv()
        .fromFile(orig)
        .subscribe((json, line) => {
            return new Promise((resolve, reject) => {
                // long operation for each json e.g. transform / write into database.
                console.log(line);
                console.log(json['ip']);
                const options = {
                    uri: `http://api.ipstack.com/${json['ip']}?access_key=********************`,
                    json: true
                };
                rp(options)
                    .then(function(data) {
                        console.log(`IP: ${data.ip}, Lat/Long: ${data.latitude},${data.longitude}`);
                        json['longitude'] = data.longitude;
                        json['latitude'] = data.latitude;
                        console.log('Changed obj:');
                        console.log(json);
                        resolve();
                    })
                    .catch(function(err) {
                        reject(err);
                    });
            });

        }).then((json) => {
            console.log('Writing JSON object to file...');
            fs.writeFileSync('./data.json', JSON.stringify(json, null, 2), 'utf-8');
        }).catch((err) => {
            console.error(err);
        });
}

initial_csvtojson();

Make sure to replace the *s with your ipstack.com API key if you wish to use the above script. We use the csvtojson module’s native promise support alongside request-promise-native to quickly turn c.csv into data.json.

We make an ipstack.com API request for each row in the CSV, append the returned longitude/latitude to the row’s object, and csvtojson carries these objects through until the final array is created and written to file – which then looks like the snippet below:

[{
        "ip": "161.0.153.46",
        "name": "",
        "country_id": "TT",
        "city": "Port of Spain",
        "version": "dnsmasq-2.49",
        "error": "",
        "dnssec": "false",
        "reliability": "1.00",
        "checked_at": "2019-02-27T13:00:21Z",
        "created_at": "2017-01-08T22:03:27Z",
        "longitude": -61.5167,
        "latitude": 10.65
    },
    {
        "ip": "201.76.162.156",
        "name": "mvx-201-76-162-156.mundivox.com.",
        "country_id": "BR",
        "city": "Rio de Janeiro",
        "version": "",
        "error": "",
        "dnssec": "false",
        "reliability": "0.99",
        "checked_at": "2019-02-27T12:54:31Z",
        "created_at": "2015-01-10T15:46:41Z",
        "longitude": -43.3307,
        "latitude": -22.9201
    },
    {.........}
]
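As a quick sanity check that every sampled row made it through, you can count the "ip" keys in data.json, since each node object carries exactly one. The heredoc below fabricates a two-entry stand-in file so the check is demonstrable; against the real file you would expect 50:

```shell
# Stand-in data.json with two entries (the real file has 50):
cat > data.json <<'EOF'
[{ "ip": "161.0.153.46", "longitude": -61.5167, "latitude": 10.65 },
 { "ip": "201.76.162.156", "longitude": -43.3307, "latitude": -22.9201 }]
EOF

# One "ip" key per node object, so this counts the nodes:
grep -c '"ip":' data.json
```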

Great – in just a couple of steps we have a data.json file that we can use in the browser with D3.js to produce a world map of highlighted public DNS nodes.

I’m going to write up how I did this in part 2!

Jedv

Rsync backup script for websites (document root) & MySQL databases – to rsync.net or others

I have just got an account with rsync.net to back up one of my server’s websites, including the webserver’s document root and all of their MySQL databases. I wrote a slightly fancier Bash script that is a little more complex than a single rsync command, and caters for dumping the databases to individually named and dated files.

Read the header comments of the file for instructions on usage.

#!/bin/bash
# Backup a directory (document root) and all MySQL databases and send to rsync.net.
# You must have an account with rsync.net backup service (or similar) and have setup automatic ssh login via keys.
# See: http://www.rsync.net/resources/howto/ssh_keys.html
# Create the remote RSYNC_PATH using ssh before running the script.
# It is best to run this with a cronjob either daily/weekly/monthly,
# run it with --quiet in the crontab for no output except from errors
# <root@livingthedream.fun>

RSYNC_HOST=""
RSYNC_USER=""
RSYNC_PATH="" # Do not start or end the path with a /
MYSQL_USER=""
MYSQL_PASSWORD=""
DOCUMENT_ROOT_PATH="" # Put a trailing slash on this path (rsync then copies its contents).

# No need to edit below lines unless you need to modify behaviour.

databases=$(mysql -u "$MYSQL_USER" -p"$MYSQL_PASSWORD" -e "SHOW DATABASES;" | tr -d "| " | grep -v Database)

QUIET=false
RSYNC_OPT="-avH"
if [ "$1" == "--quiet" ]; then
  QUIET=true
  RSYNC_OPT="-avHq"
fi
if [ "$QUIET" = false ]; then
  echo "Starting backup.."
  echo "Dumping all MySQL databases.."
fi
mkdir -p sql
for db in $databases; do
    if [[ "$db" != "information_schema" ]] && [[ "$db" != "performance_schema" ]] && [[ "$db" != "mysql" ]] && [[ "$db" != _* ]] ; then
        if [ "$QUIET" = false ]; then
          echo "Dumping database: $db"
        fi
        mysqldump -u "$MYSQL_USER" -p"$MYSQL_PASSWORD" --databases "$db" > "sql/$(date +%Y%m%d).$db.sql"
    fi
done
if [ "$QUIET" = false ]; then
  echo "Done."
  echo "Sending SQL databases to $RSYNC_PATH/sql"
fi
rsync $RSYNC_OPT sql/ "$RSYNC_USER@$RSYNC_HOST:$RSYNC_PATH/sql/"
if [ "$QUIET" = false ]; then
  echo "Done."
  echo "Backing up document root to $RSYNC_PATH/sites"
fi
rsync $RSYNC_OPT "$DOCUMENT_ROOT_PATH" "$RSYNC_USER@$RSYNC_HOST:$RSYNC_PATH/sites/"
if [ "$QUIET" = false ]; then
  echo "Done."
  echo "All Done."
fi

Make sure to fill in the variables at the top and set up your cron job once you have tested it – check out rsync.net for their awesome setup and nicely priced remote backup solutions. Their technical support is also most helpful.
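For reference, a crontab entry for a nightly run might look like the line below. This is a config fragment rather than a command to execute, and the script path is hypothetical – adjust it to wherever you saved the script:

```shell
# Run the backup nightly at 03:30 with --quiet, so cron only
# emails you output when something goes wrong.
30 3 * * * /root/bin/backup.sh --quiet
```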