Atom editor and C++

With a recent update of the Atom editor from GitHub, developing C++ applications has become more pleasant. Enhancing the editor with packages that use clang tools enables quick and easy program writing, which is great for creating proof-of-concept applications.

Installation

On the Atom homepage, atom.io, there are precompiled packages to download as well as the source code. I use Linux Mint, which is based on Ubuntu and uses the apt package manager, so the instructions below add an apt source for easy updates.

Add the Atom PPA repository

sudo add-apt-repository ppa:webupd8team/atom

Update the apt package database and install Atom

sudo apt update && sudo apt install atom

Enhancing Atom with packages

Installing packages in Atom is very easy. For C++ development I use packages that rely on clang and the clang tools; these need to be installed separately, so check your apt sources for clang.
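
On an Ubuntu or Mint system, installing clang from the regular repositories looks roughly like this (the plain ‘clang’ package name is an assumption; some releases ship versioned packages such as clang-3.6 instead):

sudo apt install clang

With clang on the PATH, the clang-based Atom packages should be able to find the compiler.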

List of packages

autocomplete-clang
autocomplete-cmake
autocomplete-python
autocomplete-xml
highlight-line
highlight-selected
language-cmake
language-cpp14
linter-clang
switch-header-source
terminal-plus

The autocomplete packages are very helpful when using STL components. The linter helps quickly discover mistakes and potential bugs while typing.

I use CMake to generate the appropriate build files, so that I can just type ‘make’ to rebuild the application. Autocomplete for CMake helps a lot when adding new commands, which are sometimes long.
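
For reference, building with the CMakeLists.txt shown further down boils down to generating the Makefiles once and then rebuilding with make:

cmake .    # generate Makefiles from the CMakeLists.txt in the current directory
make       # rebuild the application after every change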

Snippets

For quicker program writing, I like to use snippets. For now I have added two that help me quickly set up a new program folder: one with a main function and one with CMake instructions.

Adding new snippets is easy. Edit -> Snippets…

'.source.cmake':
    'Minimum CMake File':
        'prefix': 'cmake'
        'body': '''
            cmake_minimum_required(VERSION 3.1)
            project(main)
            set(CMAKE_CXX_STANDARD 11)
            file(GLOB SRC *.cpp)
            add_executable(main ${SRC})
        '''

'.source.cpp':
    'Minimal C++ program':
        'prefix': 'cmin'
        'body': '''
            #include <iostream>

            using namespace std;

            int main(int argc, char *argv[])
            {
                $1

                return 0;
            }

        '''

To use the above snippets, simply type ‘cmin’ in a C++ file for the minimal C++ program, or ‘cmake’ in CMakeLists.txt for the minimal set of CMake instructions needed to compile the project.

Check remaining transfer limit

For two months I have been using a mobile broadband connection at home. It’s an LTE connection, fast and reliable, and the signal in my flat is strong.

I use Linux Mint on my laptop, so naturally I wanted a modem that works under Linux. I have a Huawei E3276 USB modem, and it works just fine.

The Internet service would be great, however my ISP imposes a transfer limit: my contract allows 15 GB of downloaded data per month. That doesn’t sound like much, but it’s enough for my usage. I work in IT, so I get plenty of computer and Internet access during the day. At home I just read articles and blog posts, watch some YouTube videos, or occasionally do a bigger download.

When the limit is exceeded, my connection speed is throttled to 512 Kbit/s. Fortunately the connection isn’t cut off completely, and I don’t have to pay extra for bytes downloaded above the limit.

To quickly check how much transfer is remaining in the current month I wrote a Python script. For the script to work, it needs the vnstat Linux command-line tool.
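
vnstat keeps per-interface traffic counters in the background; installing it and looking at the monthly totals the script works from goes roughly like this (the interface name is only an example for the modem connection):

$ sudo apt-get install vnstat
$ vnstat -m -i ppp0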

$ ./transferLimit.py 
Downloaded:	11661.48 MiB
Limit:		   11500 MiB
Remaining:	 -161.48 MiB

The script is on my BitBucket: TransferLimitCheck

Why I like Ubuntu/Linux software repositories

At home I have Ubuntu 12.04 installed on my Toshiba laptop and I use it most of the time; only rarely do I reboot into Windows 7 to play some games. At work I use a workstation with Windows 7.

A few days ago I wanted to update my software on the Windows 7 machine at work. I use some freely available tools, for instance FreeMind, a mind-mapping tool. On Ubuntu I would type the appropriate apt-get command in the terminal, but not on Windows: there I have to update each application manually. Even when an application provides an update mechanism, it has to be invoked by hand. There is no central place or database where you can say ‘check for updates of my installed apps’, hit accept, and have the software updated.
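
On Ubuntu, that whole ‘update everything’ step boils down to two commands:

sudo apt-get update     # refresh the package lists from all configured repositories
sudo apt-get upgrade    # upgrade every installed package that has a newer version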

Some years ago, using certain software on Linux was, yes, a pain in the ass. Now, with Ubuntu and apt-get, that problem is solved. If an application is available for free, it will most probably be in the software repository. We also get an extra feature: searching for software, with no need to hunt the Internet for a download site. And when an application isn’t in the main repository, it may be in another repository, which can be added to the list of sources that apt-get searches through.

Some packages can be downloaded from a website as deb packages and, when installed, they add their repository information to apt-get’s configuration files. The next time the apt-get package list is updated, this repository is checked too. Google Chrome does that, for example: once installed, apt-get checks for new versions of the web browser along with the other software.

Then there is the problem of dependencies. Yes, some years ago, when one had to install from source, there were mountains of problems. Now many projects, even when you have to compile and install them from source, provide routines to generate deb or rpm packages, or the software creator maintains his or her own repository.

In Windows there is no central repository, or at least I am not aware of such a mechanism. When it comes to managing software installations with built-in tools, Linux distros are way ahead of Windows.

HP Thin Client as a small home server

I have an HP Compaq t5000 thin client. It’s a small x86 computer, which I decided to use as a small headless server. Currently it runs Windows XP, and I want to install Debian Linux on it. To do this I need a separate storage device, because I don’t want to erase Windows XP from the built-in flash memory. Fortunately the flash memory is connected to the motherboard with a 44-pin IDE connector, so I bought an adapter for a CompactFlash memory card.

I first bought a cheap Kingston 4 GB CF card; it was a small mistake to buy a cheap, low-performance card. When I tried to install a basic system on it, there were errors saying that writes went out of bounds of the device, and other errors about damaged ext2 inodes referencing each other. So I bought another CF card, this time a SanDisk Ultra 30 MB/s. I installed the basic Debian Linux system on it and it works without the I/O errors.

Here is what I did so far. To create and install the basic system I used Debian Linux 6 installed in a VM. First I downloaded the base system with debootstrap (all operations below were done as root):

$ mkdir -v debian_hp
$ debootstrap --arch i386 squeeze debian_hp http://ftp.debian.org/debian/

Next I chrooted into the newly installed system:

$ mount -t proc proc debian_hp/proc
$ mount -o bind /dev debian_hp/dev
$ LC_ALL=C chroot debian_hp /bin/bash

There are some packages to install:

$ apt-get install openssh-server linux-image-2.6.32-5-486 grub

Right now I am testing the CF card solution with a read-write file system; later, when I have configured the system better, I will set it up to run with a read-only root file system, so there won’t be any writes to the card.

Some configuration files need to be set up before copying to the CF card. The fstab contains information on how to mount partitions and virtual file systems. An example file can be copied from ‘/usr/share/doc/mount/examples/fstab’; below are the entries from the file I created:

# Format:
# <file system>   <mount point>   <type>  <options>               <dump>  <pass>
/dev/sda1        /               ext2    defaults,noatime        0       0
proc             /proc           proc    defaults                0       0
tmpfs            /tmp            tmpfs   defaults                0       0

Next comes the configuration of the network interfaces, in the file ‘/etc/network/interfaces’:

auto lo
iface lo inet loopback
auto eth0
allow-hotplug eth0
iface eth0 inet static
    address 192.168.0.2
    netmask 255.255.255.0

The eth0 interface is set to a static address right now, because the thin client isn’t connected to my home network via the main switch/router. For testing and setup I use a static address and connect it directly to my laptop with a LAN cable.

Now let’s get the CF card ready. I used fdisk to delete all existing partitions (there was only one) and create a new one with the boot flag set, then created a file system on the card with mkfs.ext2.
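
The partitioning itself looked roughly like this (the card shows up as /dev/sdc here, matching the commands below):

$ fdisk /dev/sdc        # delete the old partition, create one new primary partition, set the boot flag
$ mkfs.ext2 /dev/sdc1   # create an ext2 file system on the new partition

With the file system in place, I could copy the prepared system onto the card: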

$ mkdir -v /mnt/cf
$ mount /dev/sdc1 /mnt/cf
$ cp -aR debian_hp/* /mnt/cf

After the files were copied, it was time to install GRUB and configure it:

$ grub-install --root-directory /mnt/cf --recheck /dev/sdc
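
To run the remaining commands inside the new system, I chrooted into the card the same way as before, bind-mounting /proc and /dev first (a sketch mirroring the earlier chroot):

$ mount -t proc proc /mnt/cf/proc
$ mount -o bind /dev /mnt/cf/dev
$ LC_ALL=C chroot /mnt/cf /bin/bash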

Inside the chroot on the CF card (/mnt/cf), update the GRUB configuration and invoke ‘passwd’ to set the root password:

$ update-grub
$ passwd

Exit from the chroot, unmount the CF card and plug it into the thin client. The system boots up and there are no I/O errors, but there are some minor errors in the logs; probably I didn’t configure something properly yet.

Ubuntu 12.04 Dual-Monitor setup with Nvidia graphics card

My laptop has two graphics cards: an integrated Intel video card and an Nvidia GeForce GT330M. When I had Ubuntu installed and used it as my primary OS, most of my problems were with the display output setup.

On Ubuntu 11.10 I could switch the display output configuration for the Intel card without rebooting or restarting the X server, and newly attached monitors or projectors were detected without a problem. When I activated the Nvidia drivers and started to use the discrete video card, I couldn’t do this any more via the normal system settings. I had to use Nvidia’s tool, and almost every configuration step required an X server restart or a reboot, which meant closing running applications.

Since I am using a laptop, it’s normal to change the display settings while the system is running. I couldn’t imagine restarting my laptop every time I had to connect a projector.

I installed Ubuntu 12.04 LTS with the help of Wubi (the Windows Ubuntu installer). When Ubuntu boots up it runs just like a native installation, with access to the physical hardware. I checked the new Nvidia drivers and the Nvidia settings tool, and now I am able to use TwinView, which lets me extend my screen to a secondary display without closing my applications. I checked the HDMI and D-Sub outputs and it works. I couldn’t set up screen cloning, but extending the screen is enough.

To set up TwinView, just open the Nvidia settings window.

On the left, click ‘X Server Display Configuration’. Plug in the external display and click ‘Detect Displays’.

A second, disabled display will show up in the middle panel; click on it. From the ‘Configuration’ drop-down menu choose ‘TwinView’.

Click ‘Apply’. A confirmation window with a 15-second countdown will appear, probably on the secondary screen. With HDMI the monitor detected the signal quickly and displayed the picture. With the D-Sub connection I had to wait a little and then quickly hit the OK button to confirm that the new video settings are good.

All the changes were made on a running system with applications open. After the changes, the applications were still running and there were no problems with the extended screen. Even detecting newly attached displays without rebooting works great. There should be no problems with projectors either, or at least I hope so, because I have tested this only with my external monitor.

Install Ubuntu 12.04 from Windows without repartitioning

After the overheating problem on Windows 7 I remembered why I liked the Linux operating system. Once installed and running, and as long as you don’t change anything like the configuration of services and applications, it always runs the same. By that I mean there is no problem of the system getting old and slow. Of course, the problem of viruses, trojans and worms on Windows is another argument for using Linux: no antivirus software needed.

But Linux had its downsides. I wrote that I usually work on a Windows 7 workstation, so I wanted to use the same development environment at home, to avoid getting stuck figuring out how to port my solution from one system to another. I also had some annoying hardware issues, especially with the Nvidia graphics card. Whenever I wanted to switch the video output, I had to restart the X server and close running applications. My Ubuntu 11.10 installation had the Nvidia drivers, but they only worked with my laptop in Xinerama mode; there was no TwinView option, and I don’t know why, I couldn’t get it working. The same problem appeared when I wanted to connect a projector via a VGA cable: I had to reboot the computer to get it working.

When Ubuntu 12.04 LTS was released, I installed it in VMWare Player. Since I don’t have a native installation of Ubuntu, I can’t check whether the Nvidia graphics drivers were fixed to work better in a dual-monitor setup.

Recently I watched an episode of Hak5 where Shannon was talking about Wubi. The Windows Ubuntu installer installs Ubuntu into an image file and adds an entry to the Windows 7 boot menu. When you reboot your machine you can choose Ubuntu and boot from the image stored on the Windows hard drive. The installer does not alter the partition table, and it can be uninstalled just like a regular application from the Control Panel. I didn’t want a dual-boot setup, but the VMWare solution does not let me check whether my hardware problems are gone. Wubi does, because the system behaves just as if it were installed on the hard drive. You can install new software and change the configuration, and after a reboot the changes are still there; it isn’t like a LiveCD, it’s more like a LiveUSB with a persistent data mode.

To install Ubuntu from Windows, go to the Windows installer for Ubuntu Desktop page and download the installer.

The installation instructions are there as well, just follow the link: Installing Ubuntu with the Windows installer.

I followed the instructions and installed Ubuntu on my hard drive. It can be removed from the system via the Control Panel.

The Ubuntu images are stored in the directory provided during installation setup. I created a 20 GB disk.

I booted into the newly installed system, which allowed me to check how my Nvidia card works. After a quick check the TwinView option was available, so now I can get a lot more out of my Nvidia graphics card and avoid the frustration of setting up a dual-monitor configuration or an external projector.

Ubuntu 12.04 LTS on VMWare Player

Today Ubuntu 12.04 LTS arrived. I’ve switched back to Windows 7 as my main OS, and I don’t want to reboot into Linux whenever I need to check something or work on Linux for a little while. Instead, I decided to use a virtual machine for this task. I probably won’t run any resource-demanding tasks on my Linux VM, so it should be enough.

On Linux I used the VirtualBox virtualization software. It is a good solution, but I like the one from VMWare better. They provide the free VMWare Player, in which one can create VMs and run them. Since I won’t need VMWare’s advanced features, this solution will be enough for me.

I have already installed VMWare Player and downloaded the ISO file for Ubuntu 12.04 LTS. Let’s install it!

1. Start VMWare Player and create a new virtual machine.

In the ‘New Virtual Machine Wizard’ window, select ‘I will install the operating system later’.

2. In the next window select the guest OS; I chose Linux and Ubuntu 32-bit. Click Next.

3. Type in the name and select the path for the VM’s files, then click Next.

4. Create the virtual hard disk; I left the default settings and clicked Next. I think 20 GB is enough.

5. In the next window click ‘Customize hardware’, attach the Ubuntu ISO file and configure the network. I chose bridged networking, then closed the dialog and clicked Finish.

6. Start the VM, wait for Ubuntu to boot and click ‘Install Ubuntu’. Follow the installation instructions and wait for the installation to finish.

7. When the installation process is finished, reboot the VM and install ‘VMWare Tools for Linux’. The tools provide resolution changes when the window is resized, shared folders and a shared clipboard. Choose Virtual Machine -> Install VMWare Tools… and follow the instructions.

Ubuntu 12.04 LTS is installed and working, and I checked that the network works too. So now, whenever I need to do something on Linux, I have a working VM.