February 07, 2012
Filed Under (Linux, MS Windows Vista 7, 8 etc, Open Source, Technology) by Ollie Cronk on 07-02-2012
I think this is a silly move – surely one of the things that keeps people tied to Windows is the fact that they know how to use it. If they (badly) copy Mac / Linux and force people to re-learn how to navigate the OS, won't more people just switch to Mac/iPad and Linux? Especially given Android's recent successes, and the continuing Apple obsession?
Just need Google to ditch the cloud obsession from their Chromebook / Chrome OS or create an Android for PCs to accelerate it…
I hope for MS's sake they keep an option in to make the OS look like Windows 7 – e.g. a basic theme?
November 09, 2011
Quick post – caveat: I haven't had a chance to proofread this one and it's late, so it will have to do for now!
You may not be aware that you can use a PlayStation 3 as a media streaming/playback client using a system called DLNA (closely related to UPnP). This allows you to play content from your computer on your main TV in HD. Windows Media Player can act as the "server" portion, but it's not ideal for connecting to the PlayStation.
Crude diagram here, might expand this with my full setup when I get a chance:
I have been trying to get this working for a while. Essentially the plan is to get access to downloaded videos, videos from my camcorder (now HD, to save burning them to DVD or Blu-ray), plus my photos and music collection, from my Ubuntu Linux server that holds all my content (on a RAID 1 mirrored disk setup) to my TV and home cinema/HiFi setup.
Last time I tried this I used a small command line utility, and my PS3 was only connected to the server via wireless – the result was stuttering music, let alone video. So it's something I gave up on for the time being.
Recently I have been able to overcome this: I discovered flat gigabit Ethernet cables that I can run out of my double-glazed windows (even when shut!). So I have hacked together a gigabit backbone that connects my TV and AV kit (including the PS3) to my Linux server (in fact the very one that served this blog page to you), which holds gigabytes of multimedia (now there's a word you don't hear much these days!).
Also discovered http://code.google.com/p/ps3mediaserver/ which is a great Java based server component for UPnP based streaming – as the name suggests it's specifically designed for connecting the PS3 up to content…
Hey presto excellent quality video (including 1080p video) and music on the TV / HiFi!
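For anyone setting this up themselves: PS3 Media Server reads its settings from a PMS.conf file, and pointing it at your content is essentially one line. A minimal sketch (the paths are illustrative for my kind of setup, not defaults – check the PMS.conf bundled with your download):

```
# PMS.conf – minimal sketch; paths are illustrative
folders = /srv/media/video,/srv/media/music,/srv/media/photos
```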
Next, to work out how to get my iTunes music (stuff that only plays in iTunes rather than MP3s) across and available to the PS3. I have moved and shared my iTunes media folder (I have that on the network too – as per these instructions – http://lifehacker.com/230605/hack-attack-share-your-itunes-music-library-over-your-home-network – so I can re-use iTunes across different machines and keep it backed up).
September 22, 2011
This is an article I have been stewing on for a while, and having recently changed from a consultancy largely working on public sector IT projects back to a private sector IT department has given me several different viewpoints.
I also recently attended the excellent Zapthink SOA and Cloud course in Amsterdam – so I am now a Licensed/Certified Zapthink Architect!
Time for a change?
In the continued difficult financial climate, will organisations continue to have the appetite and budget to invest in large scale greenfield COTS (Commercial Off The Shelf) IT projects and licensing – e.g. large scale commercial enterprise systems such as ERP? And what's the next success for Open Source Software (OSS)?
Is the future more incrementally/agile delivered, open source, best of breed systems, rather than big monolithic, generic packaged software that does everything OK but doesn't excel at much, if anything – and, worst of all, often requires the business to change its processes to fit the software? Of course the lines between commercial software and Open Source are becoming more blurred, with "Commercial Open Source" (in other words commercially backed and supported).
I am thinking here of solutions that are developed on open standards / common platforms (e.g. J2EE) using common, standards based middleware and the XML family of technologies to connect them together. Of course there is a risk that if you pick and choose lots of niche software that does its job well, you can end up with a big mess of spaghetti integration and duplication. But that is where effective architecture, standards and governance come in: to keep things on the right track and aligned with business priorities.
Certainly the agile (iterative) methodology seems to be taking hold in larger companies, although waterfall still seems to be favoured in government – due to the perception that it will result in a fixed cost. Unfortunately, too often it doesn't deliver successful results: it's too rigid, it often ends up costing far more through the vendor's cunning use of change control, and depending on the project the initial build can be as little as 10% of the total costs in any case.
What about the cloud? Isn’t that supposed to reduce costs…
I think many in the IT industry (well, some vendors anyway) would argue right now that the answer to this is delivery via the cloud, using a pay-as-you-need-it service based model (to get away from having to make the big upfront investment in hardware and licensing). I guess this is an option, but I think most large businesses (who have the budgets for the larger IT projects) are looking at the cloud quite sceptically, waiting for it to mature beyond e-Commerce and online-type applications and add the security and reliability that is needed. Instead they are keeping things in their own data centres and exploiting virtualisation to optimise costs at the infrastructure layer. Cloud as your Disaster Recovery (DR) / data archiving environment looks like one of the most compelling use cases so far.
I am seeing some suggestions that organisations would like to adopt this approach in some areas (eg Integration). In fact one of the places I worked in the past built its own home grown ERP / eLearning platform on Open Source. In my current role we are looking at Open Source alternatives – particularly for Integration and Infrastructure glue.
It's interesting to see how the adoption of Open Source has matured – from just the Linux OS used for servers, and Linux + Apache for static web, towards LAMP and other Apache projects such as Tomcat, even more so with "Web 2.0". Data integration / ETL and middleware is a big area for OSS – e.g. Talend, ActiveMQ, GlassFish. J2EE is a big success story too.
And of course now, with Android, OSS has finally come into contact with the casual end user (rather than the techies like me that run Linux on the desktop). This was brought home to me the other day when a completely non-IT friend showed me his Motorola Xoom and was extolling its usability.
Interesting times. I wonder where OSS will infiltrate next? I guess the answer is probably wherever it can disrupt the marketplace in an engaging way for the consumer, or with a commercial model that is compelling to business/IT decision makers.
December 17, 2010
Filed Under (Architecture and Strategy, Open Source, Technology, Web Development) by Ollie Cronk on 17-12-2010
We've gone through quite a few security / penetration / web application tests at work (often as part of compliance with HMG SPF / InfoSec standards for UK Government projects), so I thought it would be useful to list some of the steps you need to consider (hardening, configuring etc.) to reduce your application's security exposure. I feel that you should view security testing as an opportunity to improve the quality of your work rather than see it as a box-ticking exercise (ultimately the testing is about making your application more secure, which can only be a good thing). Whilst a lot of our work is based on LAMP (Linux, Apache, MySQL, PHP), many of the concepts below apply regardless of the technology used.
Firewalls and Port Access
Firewalls and access to ports – one of the most obvious, but you need to consider whether the risk profile requires one or two levels of hardware firewall, or whether iptables is sufficient. Can you lock down the environment so that you only expose ports 80 or 443 to the wider internet, and create a restricted IP address based white list for administration (e.g. SSH access)? On many of our architectures we only expose the load balancer(s) and/or proxy layer to the internet; everything else is not available at all to general IP addresses across the internet.
If you do have to have SSH open to all, make sure that you install denyhosts (which helps to prevent SSH brute force attacks by adding persistent bad username/password attempts to /etc/hosts.deny, preventing access from the offending IP address).
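As a concrete sketch of the lock-down described above, an iptables rule set might look like the following. This is firewall configuration, not a ready-to-run script: the interface handling and the admin address (192.0.2.10) are illustrative assumptions – adapt and test before applying to a live box!

```shell
# Default-deny inbound; allow established traffic and loopback
iptables -P INPUT DROP
iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
iptables -A INPUT -i lo -j ACCEPT
# Expose only the web ports to the wider internet
iptables -A INPUT -p tcp --dport 80 -j ACCEPT
iptables -A INPUT -p tcp --dport 443 -j ACCEPT
# SSH only from a trusted admin address (192.0.2.10 is illustrative)
iptables -A INPUT -p tcp -s 192.0.2.10 --dport 22 -j ACCEPT
```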
Cross Site Scripting (XSS) and SQL Injection vectors
Try entering "><script>alert('If you see this in an alert box there is an XSS vector in your application')</script> into a username box (for example). If it brings up an alert dialog you know you have a problem. See the XSS Wikipedia page for more info.
Similarly for SQL – if you put in rogue SQL keywords, does it mess with the SQL that is run? Do something non-destructive (particularly if you are spot checking a live web site environment!). A good example I like to use is: can I add parameters to a WHERE clause to see data I shouldn't be able to see?
Personally I prefer two levels of checks for SQL injection and XSS type code in application input: one at the application input layer (e.g. sanitising user input as soon as possible) and another at the database interface / wrapper layer, to ensure nothing nasty can get sent to be stored or messed about with on the database tier.
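Our stack is PHP, but the two-level idea is language-agnostic; here is a minimal Python sketch (the function names are mine, and sqlite3 stands in for MySQL) of sanitising at the input layer and using parameterised queries at the database layer:

```python
import html
import sqlite3

def sanitise_input(value: str) -> str:
    """Input-layer check: neutralise HTML/script content as soon as it arrives."""
    return html.escape(value.strip())

def find_user(conn, username):
    """Database-layer check: a parameterised query, so SQL keywords in the
    input are treated as data, never executed as SQL."""
    cur = conn.execute("SELECT id, name FROM users WHERE name = ?", (username,))
    return cur.fetchall()

# Tiny in-memory database for demonstration
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (name) VALUES ('alice')")

evil = "alice' OR '1'='1"
print(find_user(conn, sanitise_input(evil)))  # [] – the injection finds nothing
print(find_user(conn, "alice"))
```

Either layer alone would stop this particular attack; the point of doing both is that a mistake at one layer (a form field someone forgot to sanitise) is still caught at the other.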
Server Hardening / Configuring
Ensure the server is set up and configured properly:
Google for the hardening guide for your operating system and check it for recommended steps.
Ensure that security updates are being applied on a regular basis.
Ensure that anti-virus software is installed (for the Linux platform, ClamAV is an option).
Review (and peer review if possible) the configuration files for the main services on this box – for a LAMP stack this means at a minimum Apache (httpd.conf / apache2.conf), PHP (php.ini), MySQL (my.cnf) and SSH (sshd_config).
(You can run locate <name of config file> to check where each one lives.)
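On Ubuntu (the platform used here), one low-effort way of keeping those security updates applied on a regular basis is the unattended-upgrades package; enabling it is a small apt configuration fragment (a sketch – check your distribution's documentation for the current mechanism):

```
# /etc/apt/apt.conf.d/20auto-upgrades
APT::Periodic::Update-Package-Lists "1";
APT::Periodic::Unattended-Upgrade "1";
```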
These checks are particularly important if you are having a white box review of your system (where you give the SSH login details to a security tester to check the configuration).
Pre test checks
Before you hand over the system to the internet security guys, run some of the kinds of tools they will be running yourself, to see what is exposed. As a minimum, run an nmap command against your IP addresses:
nmap -A -vv [IP Address]
And see what ports (and information about those ports) are returned. Also check whether nmap can enumerate which operating system and versions of web server software are running (can you do anything to remove version numbers or product names?).
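On the Apache side, two directives go a long way towards hiding product and version details from scans like the one above (Apache 2.x, set in httpd.conf / apache2.conf):

```
# Reduce information leakage in responses
ServerTokens Prod      # header becomes just "Server: Apache"
ServerSignature Off    # no version footer on error pages
```

For the P in LAMP, expose_php = Off in php.ini similarly removes the X-Powered-By header.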
These days I like to use BackTrack (a Linux distribution designed for security testing) for security checks. I am running it as a virtual machine from within my Windows 7 machine (this is a useful video for getting it set up: http://g0tmi1k.blogspot.com/2010/01/tutorial-video-how-to-install-backtrack.html).
I could probably write all day about security but hopefully this gives a feel for the key aspects. Would be interested to hear anyone’s tips or must dos for LAMP security.
October 23, 2010
Recently I changed the server this blog runs on to a low power dual core Intel Atom in a smaller form factor case (mini-ITX), in an attempt to reduce my environmental and electricity footprint. I took the opportunity to upgrade Ubuntu Server to 10.04 LTS, which comes with MySQL 5.1, and WordPress is now 3.0.1 (which was a very easy upgrade – one click from within the web based admin – well done WordPress team for that!).
The Dual Core Opteron box this blog used to run on will now only be powered up when I am experimenting with Server Operating systems (will be re-built as VMware ESX host).
Getting in some IT geekery before my life gets turned upside down!
November 09, 2009
Further to my blog posts involving Vista (and the tweaks that can help make Vista/Windows 7 compatible with Samba), I came across a registry setting that needs to be changed to get Offline Files to work correctly:
“Set the following registry key on the Windows Vista client to prevent files from getting pulled down to the client again right after synchronizing changes to the server (due to Linux file systems having coarser timestamp resolution than Windows):
Create a DWORD value named RoundUpWriteTimeOnSync under the HKLM\Software\Microsoft\Windows\CurrentVersion\NetCache key (create the key if it does not exist) and set it to 1.” from the Storage Team at Microsoft’s Blog: http://blogs.technet.com/filecab/archive/2007/03/16/using-offline-files-with-samba-emc-servers-nas-devices.aspx
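For convenience, the setting quoted above can also be applied as a .reg file (double-click to import; this just encodes the key and value from the Microsoft blog post):

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\Software\Microsoft\Windows\CurrentVersion\NetCache]
"RoundUpWriteTimeOnSync"=dword:00000001
```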
July 24, 2009
Some Recent Tech Discoveries I thought I’d share:
Windows 7 RC – writing the blog post from it – excellent OS (and that says a lot coming from me!)
Spotify – sure, lots of people know about this one now, but it's a great streaming music service. Kind of like a commercial radio station where you get to choose the playlist. A native version for Linux would be nice, though (netbooks will make this kind of porting happen organically now, I suspect?).
ebox – not a good move to just try and install this on an existing Ubuntu box (I tried this at home) – it screwed lots of stuff up. It looks good and the concept is a great idea, but if you want to try it out use a separate box; I think it's a bit too flawed for me right now (sorry ebox devs).
Denyhosts (prevents brute force attacks on SSH by adding IP addresses that repeatedly fail to log in to a black list – in /etc/hosts.deny) silently stopped working some time ago on my Ubuntu server (due to an upgrade of Python by the looks of things). Following the fix on this forum thread sorted the problem, although I found the file you need to change is /usr/share/denyhosts/daemon-control-dist rather than the one mentioned.
HMG InfoSec standards (or rather the OTT implementation of them) – I probably can't say any more or I'll get burned in acid (it's a long and painful story…!)
More posts to come. Enjoy the summer everyone. I intend to go on a ride around Litchfield tomorrow – embedded Google Map to follow, no doubt…!
May 17, 2009
One of my projects at the moment is to look at our options for building SMS enabled web applications (specifically for us around our Zend Framework based apps). Both for data capture (Inbound) and as an alerting / notification system (Outbound).
Thought I'd pull together some of my thoughts and reference material [not exhaustive or complete yet] in case it's of use to anyone else in a similar situation. But first I'd like to thank my good friend Jem, who helped identify some different angles on this…
LinkedIn Q&A is a great reference – here are a few relevant threads that I came across (you’ll probably need a Linkedin.com account to get to these) there are lots more if you search around with SMS related keywords.
There are 2 main options – and as always it's the struggle between D.I.Y. and DRY (Don't Repeat Yourself – or my version DRY-OFF – DRY or others; for f's sake, I just wanted it to be OFF as it sounded better, anyway I'll shut up now!)
Roll your own
Pros – complete control over messaging and the ability to iron out any kinks in connectivity etc.; potentially cheaper to run / only costs you what you use (rather than having to buy credits)
Cons – more complex to set up in the first place; you need to buy and set up some hardware somewhere etc.
Use a hosted SMS gateway provider
Pros – ease of getting it up and running if the integration API (e.g. HTTP, XML or e-mail based) is easy to pick up
Cons – my concern around these guys is how do you know how good they are? Will they disappear tomorrow? What gateways are they using, how reliable are their channels, etc.?
Guide to Gateways (US-focused, but with some nice general considerations): http://www.developershome.com/sms/howToChooseSMSGateway.asp This site also has a really nice comparison table – which you could use as a template for doing your own matrix/scoring comparisons of these services.
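As an illustration of the hosted-gateway option above: most HTTP-based gateway APIs boil down to a request carrying your credentials, the destination number and the message. A Python sketch of building such a request – the endpoint and parameter names here are invented for illustration, every provider's API differs:

```python
from urllib.parse import urlencode

# Hypothetical gateway endpoint – substitute your provider's real URL
GATEWAY_URL = "https://sms.example.com/send"

def build_send_request(api_key, to, message):
    """Build the request URL for an outbound SMS.

    Numbers in international (+44...) form get percent-encoded along with
    the message text, so spaces and punctuation travel safely.
    """
    params = {"key": api_key, "to": to, "text": message}
    return GATEWAY_URL + "?" + urlencode(params)

url = build_send_request("secret", "+447700900123", "Meter reading received, thanks!")
print(url)
```

The same shape works for inbound: the provider calls a URL on your application (e.g. a Zend Framework controller) with similar parameters when a message arrives.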
We will probably go with a combination of the 2 options – using our own system for the development of services (as we have greater control) and then making use of a partner once the message volumes go above what is financially viable/scalable in house…
Once the technical bit is out of the way you then need to consider the usability and process flow around the app – e.g. if users are sending in data: queuing, acknowledging their submissions, correcting mistakes and so on…
Hope to post more on this topic if I get the opportunity! If anyone has any insights or good resources on this topic then by all means please comment on this post!
April 27, 2009
Generally Windows runs at the same speed as it does normally – so long as you don't run too much on the host OS at the same time – but of course there are limitations, e.g. games or software that needs access to devices that can't be provided via VirtualBox. And of course you could run the reverse setup, if you fancy trying Linux as a guest OS while keeping the safety net of Windows as your main operating system.
Anyway, I'm posting this as I've been using VirtualBox on Ubuntu Linux to run Windows XP. On a recent upgrade from Ubuntu 8.10 to the latest version, 9.04, VirtualBox failed to run. This was fixed by running the command given in the error message (it's nice to get a genuinely useful error message in software!).
The command I had to run was
Once VirtualBox was working again I noticed that the Host key (the key used in different combinations to switch between the host and the guest OS) was not working. Instead it was flashing the Ubuntu desktop and pulsing some circles – like radar – from the cursor. At first I didn't twig that it was simply the new mouse settings in the latest version of Ubuntu: the tickbox in System->Preferences->Mouse for "Show position of pointer when the Control key is pressed" now seems to be enabled by default – untick it:
Hey presto – the host key works again! Hopefully this is helpful for anyone else confused by this one!
Also note the position of the sliders in the above screenshot for acceleration and sensitivity – I find these settings make the touchpad on my Vaio behave in a similar way to Windows (previously my mouse felt too sluggish).
I'm now actually using Windows less and less at home (Evolution is a decent email client, Firefox offers pretty much the same browsing experience – apart from some differences with fonts – and OpenOffice allows for opening the odd Office attachment). The true acid test though is how much the wife moans, as previously she's never been happy without the familiarity of Windows…! (But then it is still available in a couple of clicks.)
I’m sure there will be more on my adventures of using Ubuntu on the desktop in due course – if I find time I’ll share anything I think others might find useful…
December 15, 2008
They've introduced a turbo button which makes use of Google Gears (the offline/browser enhancement) and added more time-saving shortcuts and a better post-writing interface (particularly for re-using existing tags on new posts – this UI for tagging taxonomy might have to be borrowed on some of my projects…). Read their blog post here for more info on the features (includes a video).
Below is the new dashboard – it now has draggable "modules". Compare it to WordPress 2.5, which I upgraded to back in May 08.
Trouble is it now makes me want to upgrade the design of my blog – but I know I have way more important things to do with my time before I faff with that again!