Categories: Linux, Open Source, Technology, Ubuntu

Raspberry Pi 4 upgrade to Bullseye and LXC notes

So whilst guides like this one: https://www.tomshardware.com/how-to/upgrade-raspberry-pi-os-to-bullseye-from-buster are very useful, there were a couple of extra things that I needed to fix.

  1. I needed to update /etc/apt/sources.list.d/raspi.list in addition to /etc/apt/sources.list – changing buster to bullseye (a one-liner for this is sketched after this list)
  2. The LXC networking config caused problems with networking after the upgrade
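
For point 1, the change can be made with a one-liner (a sketch, assuming the standard Raspberry Pi OS file locations):

# Switch both apt source files from buster to bullseye
sudo sed -i 's/buster/bullseye/g' /etc/apt/sources.list /etc/apt/sources.list.d/raspi.list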

Point 1 is explained above, but point 2 took me a while to figure out. This is only really relevant if you are using LXC (Linux Containers – a lightweight precursor to Docker / K8s). I am documenting it for anyone else who might be seeing issues (or my forgetful future self!). Also note that trying to define the static IP via dhcpcd.conf didn’t work (although perhaps that was because I was trying to configure eth0 rather than lxcbr0?!)

# interfaces(5) file used by ifup(8) and ifdown(8)

# Please note that this file is written to be used with dhcpcd
# For static IP, consult /etc/dhcpcd.conf and 'man dhcpcd.conf'

# Include files from /etc/network/interfaces.d:
source-directory /etc/network/interfaces.d

# attempting to configure eth0 here 
# like the below will cause multiple errors!
auto eth0
iface eth0 inet static
    address 192.168.x.y
    netmask 255.255.255.0
    gateway 192.168.a.b
 
auto lxcbr0
iface lxcbr0 inet dhcp
    bridge_ports eth0
    bridge_fd 0
    bridge_maxwait 0
    
# wifi
allow-hotplug wlan0
iface wlan0 inet dhcp
        wpa-ssid SSID
        wpa-psk KEY

What it needs to look like instead (the static IP is configured as part of the lxcbr0 bridge definition rather than on eth0).

# interfaces(5) file used by ifup(8) and ifdown(8)

# Include files from /etc/network/interfaces.d:
source-directory /etc/network/interfaces.d

# eth0 - the built-in ethernet is configured via the LXC bridge
# DO NOT CONFIGURE IT SEPARATELY or networking and LXC will give errors

auto lxcbr0
iface lxcbr0 inet static
    bridge_ports eth0
    bridge_fd 0
    bridge_maxwait 0
    address 192.168.x.y
    netmask 255.255.255.0
    gateway 192.168.a.b

Replace x.y and a.b with your actual addresses, of course. Hope this helps someone (and that I remember it if I need it again!)
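
To sanity-check things after applying the config (standard iproute2-style commands, nothing exotic):

# Apply the new config and confirm the bridge came up with the static address
sudo systemctl restart networking
ip addr show lxcbr0   # should show 192.168.x.y on the bridge
bridge link           # eth0 should be listed as a port of lxcbr0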

Categories: Gadgets, Linux, Open Source, Technology, Web Development, Wireless

Blog back online and now running on a Raspberry Pi Zero W

Since I finally got Gigaclear Fibre to the Premises (FTTP – which took over 18 months from when I placed my order) my blog has been offline. This was because my static IP is associated with the FTTC (regular Fibre to the Cabinet) broadband – which I am keeping as a backup/secondary WAN as it’s only costing me a couple of quid more than landline rental. Getting FTTP meant getting a new router, and I decided it was a good time to rethink my home network and implement VLANs to segregate different uses of it. I bought a very flexible little dual WAN router/switch that I am super pleased with, which allows me to separate IoT from my main network (security best practice, as IoT devices can have some horrible security holes). I now need to upgrade my main switch to a managed switch so I can implement VLANs throughout the network. I am still wondering what to do about WiFi, without an expensive upgrade to VLAN-aware access points – Gigaclear threw in a pretty decent WiFi mesh system, so at the moment I am running two WiFi networks, but that probably needs a rethink at some point.

As for the Raspberry Pi Zero W – amazing that something so small (and powered off a router USB port!) can power a WordPress blog. Sure, it’s not going to handle loads of requests (but then my blog never gets that!). I’ve also switched from Apache2 to Nginx. Now my web server is totally separate and decoupled from the rest of my Home Lab and virtual servers etc (which have become more experimental over the last few months with the new job).

I’ve also enabled HTTPS using a properly signed cert from Let’s Encrypt. Certbot is amazingly easy to use – highly recommended.
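
For anyone wanting to do the same on Nginx, the whole dance is roughly this (a sketch assuming the Certbot nginx plugin; substitute your own domain for example.com):

# Install Certbot with the nginx plugin, then request and install a cert
sudo apt install certbot python3-certbot-nginx
sudo certbot --nginx -d example.com
# Renewal is automatic via a timer, but you can test it with:
sudo certbot renew --dry-run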

Blog is still pretty broken in places, might get around to fixing that at some point!!!

Categories: Linux, Open Source, Ubuntu

Upgraded to Ubuntu Server 18.04 LTS

It was remarkably easy to upgrade over SSH (which is good, as I am not in the Linux command line world as often these days!): https://wiki.ubuntu.com/BionicBeaver/ReleaseNotes
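
For reference, the standard in-place upgrade path (assuming update-manager-core is installed, which it is by default on Ubuntu Server):

# Patch the current release fully first, then run the release upgrader
sudo apt update && sudo apt upgrade
sudo do-release-upgrade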

Have also applied many of the hardening steps at:

https://blog.ubuntu.com/2018/07/30/national-cyber-security-centre-publish-ubuntu-18-04-lts-security-guide

Well worth a read if you are running Ubuntu and want to improve security. Not mentioned there (probably as it’s more about using Ubuntu as an end-user device rather than a server) is reviewing the SSH config and hardening the SSH service.
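
A few sshd_config settings that commonly come up when hardening SSH (a sketch – adjust to your setup, and make sure key-based login works before disabling passwords!):

# /etc/ssh/sshd_config
PermitRootLogin no          # no direct root logins
PasswordAuthentication no   # key-based auth only
X11Forwarding no
MaxAuthTries 3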

Categories: Linux, MS Windows Vista 7, 8 etc, Open Source, Technology

Windows 8 will drop the start menu – is this the beginning of the end for MS OS dominance?

So Microsoft is dropping the start button from Windows in v8

I think this is a silly move – surely one of the things that keeps people tied to Windows is the fact that they know how to use it. If they (badly) copy Mac / Linux and force people to re-learn how to navigate the OS, won’t more people just switch to Mac/iPad and Linux? Especially given Android’s recent successes, and the continuing Apple obsession?

Just need Google to ditch the cloud obsession from their Chromebook / Chrome OS or create an Android for PCs to accelerate it…

I hope for MS’s sake they keep an option to make the OS look like Windows 7 – eg a basic theme?

Categories: Linux, Open Source, Sony Stuff, Ubuntu

Streaming HD content to your TV via a PS3 from a Linux Server (or Windows PC)

Quick post – caveat: haven’t had a chance to proofread this one and it’s late, so it will have to do for now!

You may not be aware that you can use a PlayStation 3 as a media streaming/playback client using a system called DLNA (built on UPnP). This allows you to view content from your computer on your main TV in HD. Windows Media Player can act as the “Server” portion but it’s not ideal for connecting to the PlayStation.

Crude diagram here, might expand this with my full setup when I get a chance:

I have been trying to get this working for a while. Essentially the plan is to get access to downloaded videos and videos from my camcorder (now HD) – to save burning them to DVD or Blu-ray – and also to access my photos and music collection, all from my Ubuntu Linux server that holds my content (on a RAID 1 mirrored disk setup), on my TV and home cinema/HiFi setup.

Last time I tried this I used a small command line utility and my PS3 was only connected to the server via wireless – the result was stuttering music, let alone videos. So it’s something I gave up on for the time being.

Recently I have been able to overcome this, as I have discovered flat gigabit ethernet cables that I can run out of my double-glazed windows (even when shut!). So I have hacked together a gigabit backbone that connects my TV and AV kit (including the PS3) to my Linux server (in fact the very one that served this blog page to you) that holds gigabytes of multimedia (now there’s a word you don’t hear much these days!)

I also discovered http://code.google.com/p/ps3mediaserver/ which is a great Java-based server component for UPnP-based streaming – as the name suggests it’s specifically designed for connecting the PS3 up to content…

Screenshot of ps3mediaserver

Hey presto – excellent quality video (including 1080p) and music on the TV / HiFi!

Next, to work out how to get my iTunes music (stuff that only plays in iTunes rather than MP3s) across and available to the PS3. I have moved my iTunes media folder onto the network and shared it – as per these instructions: http://lifehacker.com/230605/hack-attack-share-your-itunes-music-library-over-your-home-network – so I can re-use iTunes across different machines and keep it backed up.

Categories: Architecture and Strategy, Open Source, Technology

Is “big bang” IT dying? Replaced with iterative approaches and best of breed Open Source?

This is an article I have been stewing on for a while, and having recently moved from a consultancy largely working on public sector IT projects back to a private sector IT department has given me several different viewpoints.

I also recently attended the excellent Zapthink SOA and Cloud course in Amsterdam – so I am now a Licensed/Certified Zapthink Architect!

Zapthink course in Amsterdam
Zapthink course (creating a SOA implementation roadmap), my colleague Martin is on the left. FB have changed their access to photos outside FB so this no longer works 🙁

Time for a change?

In the continued difficult financial climate, will organisations continue to have the appetite and budget to invest in large-scale greenfield COTS (Commercial Off The Shelf) IT projects and licensing – e.g. large-scale commercial enterprise systems such as ERP? And what’s the next success for Open Source Software (OSS)?

Is the future more incrementally / agile delivered, open source, best-of-breed systems? Rather than big monolithic, generic packaged software that does everything OK but doesn’t excel at much, if anything – and worst of all, often requires the business to change its processes to fit the software. Of course the lines between commercial software and Open Source are becoming more blurred – with “Commercial Open Source” (in other words, commercially backed and supported).

I am thinking here of solutions that are developed on open standards / common platforms (eg J2EE), using common standards-based middleware and the XML family of technologies to connect them together. Of course there is a risk that if you pick and choose lots of niche software that does its job well, you can end up with a big mess of spaghetti integration and duplication. But that is where effective architecture, standards and governance come in: keeping things on the right track and aligned with business priorities.

Certainly the agile (iterative) methodology seems to be taking hold in larger companies, although waterfall still seems to be favoured in government – due to the perception that it will result in a fixed cost. Unfortunately too often it doesn’t deliver successful results: it’s too rigid, it ends up costing far more through cunning use of change control by the vendor, and depending on the project the initial build can be as little as 10% of the total cost in any case.

What about the cloud? Isn’t that supposed to reduce costs…

I think many in the IT industry (well, some vendors anyway) would argue right now that the answer to this is delivery via the cloud using a pay-as-you-need-it service-based model (to get away from having to make the big upfront investment in hardware and licensing). I guess this is an option, but I think most large businesses (who have the budgets for the larger IT projects) are looking at the cloud quite sceptically: waiting for it to mature beyond e-Commerce and online-type applications and add the security and reliability that is required, while keeping things in their own data centre and exploiting virtualisation to optimise costs at the infrastructure layer. Cloud as your Disaster Recovery (DR) / data archiving environment looks like one of the most compelling use cases so far.

I am seeing some suggestions that organisations would like to adopt this approach in some areas (eg integration). In fact, one of the places I worked at in the past built its own home-grown ERP / eLearning platform on Open Source. In my current role we are looking at Open Source alternatives – particularly for integration and infrastructure glue.

It’s interesting to see how the adoption of Open Source has matured – from just the Linux OS used for servers, and Linux + Apache for static web, moving towards LAMP and other Apache projects such as Tomcat, and even more so with “Web 2.0”. Data integration / ETL is a big area for OSS (eg Talend), along with middleware such as ActiveMQ and Glassfish. J2EE is a big success story too.

And of course now, with Android, OSS has finally come into contact with the casual end user (rather than the techies like me that run Linux on the desktop). This was brought home to me the other day when a completely non-IT friend showed me his Motorola Xoom and was extolling its usability.

Interesting times. I wonder where OSS will infiltrate next? I guess the answer is probably wherever it can disrupt the marketplace in an engaging way for the consumer, or with a commercial model that is compelling to business/IT decision makers.

Categories: Architecture and Strategy, Open Source, Technology, Web Development

Preparing a web environment for Security Penetration Testing…

We’ve gone through quite a few security / penetration / web application tests at work (often as part of compliance with HMG SPF / InfoSec standards for UK Government projects) and I thought it would be useful to list some of the steps you need to consider (hardening, configuring etc) to ensure your application has a reduced security exposure. I feel that you should view security testing as an opportunity to improve the quality of your work rather than as a box-ticking exercise (ultimately the testing is about making your application more secure, which can only be a good thing). Whilst a lot of our work is based on LAMP (Linux, Apache, MySQL, PHP), many of the concepts below apply regardless of the technology used.

Firewalls and Port Access

Firewalls and access to ports – one of the most obvious, but you need to consider whether the risk profile requires one or two levels of hardware firewall, or whether iptables is sufficient. Can you lock down the environment such that you only expose port 80 or 443 to the wider internet, and create a restricted IP-address-based whitelist for administration (eg SSH access)? On many of our architectures we only expose the load balancer(s) and/or proxy layer to the internet; everything else is not available at all to general IP addresses across the internet.
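
As a rough illustration, an iptables policy along these lines gives you the “only 80/443 public, SSH from one admin address” setup (a sketch – 203.0.113.10 is a hypothetical admin IP, and you would normally persist the rules with your distro’s tooling):

# Default-deny inbound; allow established traffic and loopback
iptables -P INPUT DROP
iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
iptables -A INPUT -i lo -j ACCEPT
# Web open to all, SSH only from the trusted admin address
iptables -A INPUT -p tcp --dport 80 -j ACCEPT
iptables -A INPUT -p tcp --dport 443 -j ACCEPT
iptables -A INPUT -p tcp -s 203.0.113.10 --dport 22 -j ACCEPT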

If you do have to have SSH open to all, make sure that you install denyhosts (which helps to prevent SSH brute-force attacks by adding persistent bad username/password attempts to /etc/hosts.deny – preventing access from the offending IP address).

Cross Site Scripting (XSS) and SQL Injection vectors

Check that your application does something sensible if someone attempts to put JavaScript into text input boxes. For example, see what entering the following into a username box does:

"><script>alert('If you see this in an alert box there is a XSS vector in your application')</script>

If it brings up an alert dialog you know you have a problem. See the XSS Wikipedia page for more info.

Similarly for SQL – if you put in rogue SQL keywords, do they mess with the SQL that is run? Do something non-destructive (particularly if you are spot-checking a live web site environment!). A good example I like to use is whether I can add conditions to a WHERE clause to see data I shouldn’t be able to see.
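
A classic non-destructive probe (illustrative only – the id parameter is made up): if a page takes ?id=42, compare the results of a tautology and a contradiction:

id=42' AND '1'='1    (returns the same data as id=42 if input reaches the SQL raw)
id=42' AND '1'='2    (suddenly returns nothing if input reaches the SQL raw)

If those two behave differently in that way, user input is being passed straight into the query.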

Personally I prefer two levels of checks for SQL injection and XSS-type code in application input: one at the application input layer (eg sanitising user input as soon as possible) and another at the database interface / wrapper layer, to ensure nothing nasty can get sent to be stored or messed about with on the database tier.

Server Hardening / Configuring

Ensuring the server is set up and configured properly:

Google for the hardening guide for your operating system and check the recommended steps.

Ensure that security updates are being applied on a regular basis.

Ensure that anti-virus software is installed (for the Linux platform, ClamAV is an option).
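
On Debian/Ubuntu-style systems both of those can be done in a couple of commands (a sketch):

# Automatic security updates
sudo apt-get install unattended-upgrades
sudo dpkg-reconfigure -plow unattended-upgrades
# ClamAV anti-virus plus the signature updater
sudo apt-get install clamav clamav-freshclam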

Review (and peer-review if possible) the configuration files for the main services on the box – for LAMP this means as a minimum:

(You can run locate <name of config file> to check where it is located)

  • /etc/ssh/sshd_config
  • php.ini – a few sample hardening settings are sketched after this list
  • httpd.conf / apache2.conf (depending on how the server is configured) and configuration files for virtual hosts / SSL configuration
  • my.cnf (or other database config)
  • Load Balancer config files (for Pound this is typically /etc/pound.cfg)
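
As a flavour of what such a review looks at, here are a few php.ini settings that frequently come up (a sketch – defaults vary between PHP versions, so check yours):

; php.ini – sample hardening settings
expose_php = Off          ; don't advertise the PHP version in response headers
display_errors = Off      ; never show errors/stack traces to end users
log_errors = On           ; log them server-side instead
allow_url_include = Off   ; block remote file inclusion via include/require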

These checks are particularly important if you are having a white-box review of your system (where you give the SSH login details to a security tester to check the configuration).

Pre-test checks

Before you hand over the system to the internet security guys, run some of the kinds of tools that they will be running yourself to see what is exposed. As a minimum, run an NMAP command against your IP addresses:

nmap -A -vv [IP Address]

And see what ports (and information about those ports) are returned. Also check whether NMAP can enumerate what operating system and versions of web server software are running (can you do anything to remove version numbers or product names?)
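
For Apache, hiding the version details is a two-line config change (a sketch – the directives live in httpd.conf or apache2.conf depending on the distro):

# Stop advertising server version details
ServerTokens Prod      # Server header becomes just "Apache"
ServerSignature Off    # no version footer on generated error pages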

These days I like to use Backtrack (a Linux distribution designed for security testing) for security checks. I am running it as a virtual machine from within my Windows 7 machine (http://g0tmi1k.blogspot.com/2010/01/tutorial-video-how-to-install-backtrack.html is a useful video for getting it set up).

I could probably write all day about security, but hopefully this gives a feel for the key aspects. Would be interested to hear anyone’s tips or must-dos for LAMP security.

Categories: Linux, Open Source, Technology, Ubuntu

This blog is now running on new hardware & software!

I recently changed the server this blog runs on to a low-power dual-core Intel Atom in a smaller form factor case (mini-ITX), in an attempt to reduce my environmental and electricity footprint. I took the opportunity to upgrade Ubuntu Server to 10.04 LTS, which comes with MySQL 5.1, and WordPress is now 3.0.1 (which was a very easy upgrade – one click from within the web-based admin – well done WordPress team for that!).

The dual-core Opteron box this blog used to run on will now only be powered up when I am experimenting with server operating systems (it will be re-built as a VMware ESX host).

Getting in some IT geekery before my life gets turned upside down!

Categories: Open Source, PHP, Technology, Web Development

PHP Conference UK 2010 Notes and Thoughts

Here are some notes / interesting products / thoughts that were mentioned (apologies, this is more a set of notes for me than a proper blog post – if I get time I will refine it!)

I started the day on a conference call back to the office so had to miss the keynote, which was a shame as it was by quite an eccentric guy who Microsoft have hired (as a UX Architect Evangelist) – largely about keeping things simple, and usability, from what I gathered of the end of the talk.

The day was very tough as I had a late night catching up on various things to free up the Friday – it’s difficult sitting through talks when really tired!

I met several former colleagues from my last company (and former colleagues from my current company) so it was a bit of a blast from the past at times.

There appears to be a lot of development and interest around NoSQL / document-based databases at the moment – definitely something to keep an eye on as the technology matures.
http://www.phpconference.co.uk/talks

RDBMS in the social networks age
by Lorenzo Alberton

Database graph structures via advanced features of SQL, using SQL-99 and SQL:2003 functionality that MySQL certainly doesn’t have, and many other DBs won’t have the 2003 extensions. Obviously using this kind of advanced functionality will have an impact on database server load.

This talk felt a bit like it was flying in the face of most new thinking at the moment (although to be fair, this is partly covered in what Lorenzo has now put on his website below), which is to keep your database tier minimally loaded, as it’s the part that has most issues with vertical and horizontal scalability (keep most of the CPU load in the web app tier as it’s easier to add more nodes there).

Slides available at:
http://www.alberton.info/talks

Legacy Code Talk by Ibuildings
doxygen – code documentation for any language not just PHP

ctags.sourceforge.net

BOUML bouml.free.fr (reverse engineering capabilities)

phpcs – Codesniffer (part of PhpUnderControl)

Thoughts for tackling older PHP4-based projects and code bases – get them into source control, and start to apply Continuous Integration type approaches.

Suggestions made around:

  • Full isolation (separate server)
  • Using wrapper classes

Possible code rewriting routes for legacy code:

  • Going from a random mix of PHP business logic and HTML output to neater procedural code
  • Procedural to OO
  • OO to full OO

CouchDB
An early sight of the possible future of web application data persistence and replication. It’s interesting that CouchDB uses HTTP as its connecting protocol. Might it be possible (though probably not desirable outside specific cases) in the future to create web applications where JavaScript talks directly to CouchDB?

http://couchdb.apache.org/
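
Because the protocol is just HTTP, you can drive CouchDB entirely from curl – a quick sketch assuming a default local install listening on port 5984:

# Create a database, store a JSON document, then read it back
curl -X PUT http://localhost:5984/testdb
curl -X POST http://localhost:5984/testdb -H 'Content-Type: application/json' -d '{"_id":"note1","text":"hello couch"}'
curl http://localhost:5984/testdb/note1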

Web and mobile application monetisation models / Paypal X

PayPal appear to be launching a new platform / API:

  • Adaptive Payments
  • Pay multiple recipients at once
  • Partnership
  • Chained payments (e.g. commission based payments)

A bit disappointed by this one as it was about PayPal’s API (https://www.x.com) rather than strategies for monetisation, which is what the title led me to believe it would cover.

Web Services Best Practice
At the beginning there was lots of stuff about basic HTTP (eg HTTP headers, verbs) that every developer should know about.

Lorna (also from iBuildings), who gave this talk, had a slightly sarcastic, talking-down-to-you tone that I found mildly annoying – maybe she gives training to newbies all the time or something. Or maybe I was just tired. She had some interesting things to say about web services design, particularly towards the end of her talk. The talk was caveated as being a bit of “a rant” and it was exactly that in places – it felt like she was having a go at everyone a lot of the time!

The beers at the end sponsored by Facebook were a nice touch though, although I only had time to grab a quick one whilst chatting to Mark Schaschke from iBuildings and a couple of guys from my previous company. I think next year I will sit this one out to allow more developers to attend, as I think they will get more value out of it.

Categories: MS Windows Vista 7, 8 etc, Open Source, Samba

Fix for Windows 7 offline files and Samba

Further to my previous blog posts involving Vista (and the tweaks that can help make Vista/Windows 7 compatible with Samba), I came across a registry setting that needs to be changed to get offline files to work correctly:

“Set the following registry key on the Windows Vista client to prevent files from getting pulled down to the client again right after synchronizing changes to the server (due to Linux file systems having coarser timestamp resolution than Windows):

Create a DWORD value named RoundUpWriteTimeOnSync under the HKLM\Software\Microsoft\Windows\CurrentVersion\NetCache key (create the key if it does not exist) and set it to 1.” from the Storage Team at Microsoft’s Blog: http://blogs.technet.com/filecab/archive/2007/03/16/using-offline-files-with-samba-emc-servers-nas-devices.aspx
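
The one-liner equivalent from an elevated command prompt (a sketch of the same change the quote describes):

reg add "HKLM\Software\Microsoft\Windows\CurrentVersion\NetCache" /v RoundUpWriteTimeOnSync /t REG_DWORD /d 1 /f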