
Wednesday, June 20, 2018

Set Raspberry Pi as Bluetooth slave

Here are some notes about how to connect a Raspberry Pi 3 to an Android phone via Bluetooth, with the Raspberry Pi set as the Bluetooth slave.
On the Android side I'm using an off-the-shelf app.

Shopping list

  • Raspberry Pi 3 B+
  • Raspbian lite
  • BlueTerm on Android Phone

Raspberry Pi setup

  • install Raspbian lite on a micro SD card
    (since I'm using a Raspberry Pi 3 I installed the latest release, Raspbian Stretch - for older Raspberry Pi models I had better luck with Jessie)
  • connect a USB keyboard, an HDMI monitor and a wired LAN, or use PiBakery (see this article for more details)
  • log in
  • update
    • sudo apt-get update
    • sudo apt-get upgrade
  • run raspi-config (see the non-interactive sketch after this list)
    • sudo raspi-config
      • set hostname
      • set locale
      • set timezone
      • enable WiFi
      • enable I2C
      • enable SPI
      • enable SSH
  • reboot - now it is possible to disconnect the LAN cable, HDMI monitor and keyboard
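If you prefer to script these settings, raspi-config also has a non-interactive mode. The following is only a sketch : the do_* names come from the raspi-config script and may change between releases, and the hostname, locale and timezone values are just examples.

    # same options as the interactive menu, from the command line
    sudo raspi-config nonint do_hostname niris
    sudo raspi-config nonint do_change_locale en_US.UTF-8
    sudo raspi-config nonint do_change_timezone Europe/Rome
    sudo raspi-config nonint do_i2c 0     # 0 means "enable"
    sudo raspi-config nonint do_spi 0
    sudo raspi-config nonint do_ssh 0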

Bluetooth activation as slave

  • open a terminal and login via SSH
  • enable page and inquiry scan (piscan), i.e. make the device connectable and discoverable
    • sudo hciconfig hci0 piscan
  • verify that the setting is active
    • sudo hciconfig -a
      The output should show several lines, among them :  UP RUNNING PSCAN ISCAN

      For example :

        sudo hciconfig -a
        hci0:   Type: Primary  Bus: UART
                BD Address: B8:27:EB:F1:EB:2D  ACL MTU: 1021:8  SCO MTU: 64:1
                UP RUNNING PSCAN ISCAN
                RX bytes:780 acl:0 sco:0 events:50 errors:0
                TX bytes:2778 acl:0 sco:0 commands:50 errors:0
                Features: 0xbf 0xfe 0xcf 0xfe 0xdb 0xff 0x7b 0x87
                Packet type: DM1 DM3 DM5 DH1 DH3 DH5 HV1 HV2 HV3
                Link policy: RSWITCH SNIFF
                Link mode: SLAVE ACCEPT
                Name: 'NiRis'
                Class: 0x000000
                Service Classes: Unspecified
                Device Class: Miscellaneous,
                HCI Version: 4.1 (0x7)  Revision: 0x8b
                LMP Version: 4.1 (0x7)  Subversion: 0x6119
                Manufacturer: Broadcom Corporation (15)

The Raspberry Pi is configured in slave mode for Bluetooth.

When scanning for Bluetooth devices, the phone should show the hostname set earlier.
By default the setting is NOT retained after a reboot, i.e. it is necessary to re-issue the sudo hciconfig hci0 piscan command before starting the scan on the phone.
Clicking Pair on the phone should work.
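If you want the Pi to come up already discoverable after a reboot, one option (just a sketch, not something used in these notes) is to add the same command to /etc/rc.local, before the final exit 0 line :

    # /etc/rc.local - make the built-in Bluetooth controller discoverable and connectable at boot
    hciconfig hci0 piscan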

At this point it is possible to install other packages to handle Bluetooth via Python :
  • sudo apt-get install bluetooth blueman bluez
  • sudo apt-get install python-bluetooth
With the Python Bluetooth library in place it will be possible to write some code to handle the Bluetooth connection.
For now just use BlueTerm to connect to the Raspberry.
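BlueTerm opens a serial (RFCOMM/SPP) connection, so to actually exchange data something must be listening on the Pi. A quick test, sketched here with the standard BlueZ tools (on recent BlueZ versions sdptool may require bluetoothd to run in compatibility mode), could be :

    sudo sdptool add SP          # advertise the Serial Port (SPP) service
    sudo rfcomm watch 0 1        # wait for a connection on channel 1, creates /dev/rfcomm0
    # from a second terminal (as root), once BlueTerm is connected :
    echo "hello from the Pi" > /dev/rfcomm0
    cat /dev/rfcomm0             # shows what is typed in BlueTerm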


Friday, June 8, 2018

Artifactory - what is it ?

Today many projects are based on many different libraries and pieces of code.
Usually the code of a project is stored in some repository, like GitHub, but a lot of the other material is not.
For example, in order to build your project you need a compiler, libraries, modules, tools, etc.
Most of them are usually available on the net but ... there are some limits.
What happens if you want to build your project in the future ?
How can you be sure you will still find what is easily available "now" ?

Also from a security standpoint, it is always a good thing to be able to pin a specific version of a library or tool for a project.

To solve this and many other issues, programs like Artifactory exist, allowing you to create a place to store any kind of program, library, etc. and retrieve them when needed ... in other words "artifacts".
Then you can set up a project by downloading all the necessary artifacts, from an OS image to a library, utility or tool, from your artifact archive, i.e. Artifactory.

Artifactory can also be configured to keep some artifacts automatically updated from different sources, keeping different versions available.
For Linux systems it is also possible to point the system package repository (apt for Debian-based, yum for CentOS-based, etc.) to Artifactory.
This is extremely useful to control what packages are allowed in a network where there are different systems.
In the end, the automatic updates of a Linux box can be redirected to Artifactory.
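On a Debian/Raspbian machine that redirection is just an apt source pointing at Artifactory. A sketch (the host name myserver and the repository key debian-remote are hypothetical, and the repository has to be created in Artifactory first) :

    # /etc/apt/sources.list.d/artifactory.list
    deb http://myserver:8081/artifactory/debian-remote stretch main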

My use

In my case I wanted a place to store everything I need in order to build a project.
For example NiRis has a bunch of tools and libraries that are usually downloaded from different sites, like Adafruit.
But I wanted a more reliable way to obtain what I need.
I wanted to be able to reproduce a project in the future, without worrying whether a package is no longer available.
So basically having the packages stored on my server gives me these advantages :

  • capability to reproduce a project in the future
  • a local space with everything I need for a project
  • the ability to rebuild a project even while offline
  • the capability to back up my artifacts on Drobo

Installing it


It is possible to install an open source version of Artifactory directly on a server.
In my case my server already hosts a bunch of websites, so I decided to approach the installation in another way.
I installed Docker on the server.
I leave a detailed explanation about Docker for another article but, in simple terms, it is a way to have some kind of virtual server (called a container) in which to run things.
It is a nice way to have different containers running specific applications, in this case Artifactory.
It is very easy, for example, to run a website without worrying about the other websites running on the same host.

So after installing Docker I found a ready-to-use image for Artifactory, downloaded it on my server and started it.
In a few minutes I ended up having a nice Artifactory server running.

Here are the steps; of course it is assumed that Docker is already installed on your server.

  • Be sure to have Docker running (sudo service docker start)
  • Pull the Artifactory Docker image onto the server :
    docker pull docker.bintray.io/jfrog/artifactory-oss:latest
  • Run the image. The command runs Artifactory inside a container, making it accessible on port 8081 of the host :
    docker run --name artifactory -d -p 8081:8081 docker.bintray.io/jfrog/artifactory-oss:latest
    Note ! There are two images available : one is called artifactory-pro and the other artifactory-oss.
    The first is the commercial version of Artifactory, the second the open source one. The open source one has some limits but can be used for free. The commercial one is quite expensive :)
  • To stop the Artifactory :
    docker stop artifactory
    docker rm artifactory

Adding a permanent volume

In order to keep the artifacts stored permanently it is necessary to have a volume associated with Artifactory.
I chose to have the volume managed by Docker itself (a named volume) rather than a directory on the host.


To do so :

  • Stop the current Artifactory docker
    docker stop artifactory
    docker rm artifactory 
  • Create the volume in docker :
    docker volume create --name artifactory_data
    The volume is called artifactory_data 
  • Launch the Artifactory container, attaching the volume just created :
    docker run --name artifactory -d -v artifactory_data:/var/opt/jfrog/artifactory -p 8081:8081 docker.bintray.io/jfrog/artifactory-oss:latest 
  • Once the container is started, it is better to clean up the Docker area (unused containers and images) with :
    docker system prune -a

Use Artifactory

To use Artifactory simply open a browser on the server address at port 8081.
In my case I didn't open the port to the outside, so Artifactory is reachable only on my network ... of course, using a VPN, it is possible to work remotely too.
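Besides the web interface, the stored artifacts can be fetched directly over HTTP, which is what build scripts end up doing. A sketch (repository key and path are hypothetical) :

    # download an artifact from a generic repository on the local Artifactory
    wget http://myserver:8081/artifactory/generic-local/niris/some-tool-1.0.tar.gz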

Stay tuned for more articles on what to do with Artifactory.


Thursday, June 7, 2018

Backup/Sync system




OK, time to organize a backup strategy involving the Drobo unit.
The idea is to have all the data around the network copied onto the Drobo unit, i.e. the different machines back up to Drobo.





Shopping list


In order to do so, since the majority of my machines are Linux based, some tools will be used.
Here is what is involved :

  • the Drobo unit as the backup destination
  • the main Linux server, running Jenkins
  • Unison for the synchronization
  • Dropbox for the material shared among machines

Some notes about the components.
Dropbox is used to store material that is shared among different machines.
Still, I wanted to have such common material saved on my Drobo unit as well.

The Linux server, with Jenkins, will be the main controller for the data management/backup, but some external procedures, triggered by specific machines, can complement the backup system.

Unison is a program that allows synchronizing files between different directories and even different machines.

Jenkins allows running scheduled tasks, like cron but with many more options.
It is not a program born to handle backups or storage policies, but it can be used for that.
There are some advantages in using Jenkins instead of the usual scripts and cron :

  • access via web to all the functionalities
  • quick status of the jobs
  • easy to set up and maintain
  • history of the backup operations with log
Jenkins of course is not the only possible choice, but since I already have it on my server for other purposes, why not use it ?

Policy

The most important thing is the set of policies.
In this specific case, a group of policies determines what is what, what is copied, what is considered the main place for each piece of data, etc.
The Linux server acts as the main controller, deciding what/when/how data is saved on the Drobo unit via Jenkins,
i.e. the majority of the backup operations are handled by the server.

Machines


On my network the most important machine is of course the main server; then I have the development machine used for almost all activities, from development to web browsing, a VPN server, plus a plethora of devices, from VoIP phones to Echos, Android and Apple devices on the wireless network.
Plus a Dropbox account where I store some common material I want to be accessible on different machines.

So the main sources of data to be stored in the Drobo unit will be :
  • main server
  • VPN server
  • main Linux machine
  • Dropbox

This schematic block diagram describes the main structure of the system.
The server is connected to the Drobo unit and keeps the material on it synchronized.
Machines on the LAN can interact with the server, adding material that the server will back up automatically.
A Dropbox account is also linked to the server, so the server keeps a backup copy of the Dropbox content on Drobo.
Machines outside the LAN usually interact directly only with the Dropbox account.

Here is a screenshot of my Jenkins jobs related to the backup so far :


A quick view that shows everything is OK, even if some jobs had problems in the past (mainly due to testing).

Monitoring 

Other than copying material from the server to the Drobo, some activities are necessary to determine whether the system has problems.
One main utility used for this purpose is drobom, a Python program running on the main server, capable of interrogating the Drobo unit and reporting information.
Another piece of monitoring involves Dropbox.
It is important to know if the server has updated the data on the Dropbox account.
To do so there is a CLI utility for Dropbox on Linux.
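A sketch of the check, assuming the dropbox.py helper script is installed and available as dropbox for the user running the job (the exact status strings depend on the client version) :

    dropbox status                          # prints "Up to date" when everything is in sync
    dropbox status | grep -q "Up to date"   # exit code usable from a Jenkins job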

See the article Drobo - monitoring it for more information.

Unison


Let's talk briefly about Unison.
Unison is quite a powerful program : basically it allows "synchronizing" two directories, no matter where they are.
Since Unison is called from Jenkins, it is executed as the jenkins user, so it is important to remember that the internal archives and preferences for Unison will be stored in the jenkins user area, i.e. /var/lib/jenkins/.unison
Specifically, the file /var/lib/jenkins/.unison/default.prf is used to set some parameters.

For example, many directories contain files called .DS_Store or ._.DS_Store. These are files created by macOS. They don't need to be kept in sync since they are generated any time a Mac accesses the files.
So the default.prf file contains a line :

ignore = Name *DS_Store

which instructs Unison to ignore such files.
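Putting it together, a single Unison run between a local directory and a directory on another machine over SSH looks like this (just a sketch : the paths are hypothetical and opus is the server name used later in this post) :

    # synchronize the local archive with the copy on the server, non-interactively
    unison /home/user/archive ssh://opus//data/archive -batch

The -batch option avoids interactive questions, which is what you want when the command runs from a Jenkins job.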

Jenkins


Jenkins is the main engine controlling the backup and synchronization of my data.
There are different jobs in place to control the backup/sync procedures.
The policy indicates, for each group of data, where it is copied. The Jenkins jobs run automatically to keep the data in sync among the different sources.
Here are the main jobs in place (some are still under construction at the time of writing).

Drobo sanity

This job is executed every night and performs a check on Drobo, in order to send warning emails if the capacity of the unit is above 80% or one of the disks needs to be changed.
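The check itself can be as simple as parsing the drobom output. This is only a sketch and assumes the drobom tool from drobo-utils is installed on the server; the real job and its output are described in the article mentioned below :

    # report status and used capacity of the connected Drobo units
    drobom status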

See the article Drobo - monitoring it for more details.

Backup - Dropbox

This job updates the content of Dropbox onto the Drobo unit.
First it checks that Dropbox on the server is updated and in sync; if not, it attempts to force the sync first.
Then it copies the latest entries from Dropbox into the medical archive and updates the entire content of Dropbox onto Drobo.
It is executed every day.
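A rough sketch of the shell step behind this job (paths are hypothetical, the real job has more checks, and the "Up to date" string depends on the Dropbox client version) :

    # 1. make sure the local Dropbox copy is in sync before using it as a source
    dropbox status | grep -q "Up to date" || dropbox start
    # 2. mirror the Dropbox content onto the Drobo volume
    unison /home/backup/Dropbox /mnt/drobo/dropbox -batch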

Backup - Milo

Milo is the main machine, NOT the server.
This job is still under development, but the idea is to keep archives in sync from Milo to the server, i.e. the data on Milo has higher priority.
Of course it runs ONLY if Milo is on, otherwise it fails (as in the screenshot above).
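A sketch of how such a job can detect whether Milo is reachable before trying to sync (the host name milo and the paths are illustrative; -force makes Unison prefer the Milo side) :

    # fail the job immediately if Milo is switched off
    ping -c 1 milo > /dev/null || exit 1
    # sync Milo's archive to the server, giving priority to the data on Milo
    unison ssh://milo//home/user/archive /data/archive-milo -batch -force ssh://milo//home/user/archive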

Backup - Opus

Opus is the main server and this is the "main" job that actually keeps in sync the content of the server and Drobo.
The job is not executed every day.