Archive for the ‘Technology’ Category
January 06, 2013
Filed Under (Baby / Parenting Related, Cycling, Life, MS Windows Vista 7, 8 etc, Technology) by Ollie Cronk on 06-01-2013
Been ages since I last posted and I suspect it will be a while before I post again – new job, kids and the fact that Twitter is easier!
Have added a Strava widget to the right as I have now seriously got into road cycling – I have signed up for a few sportives this year including a 127 miler!
Cronk family is now bigger with the addition of our son Alfie.
Just about to start a new job in a new industry sector which is exciting.
Quick tech good, bad and ugly:
Good – Yamaha RX-V473 amp – an amazing bit of Hi-Fi / Home Cinema kit, particularly paired with Boston Acoustics speakers. Very much enjoying this – in particular the AirPlay feature and the smartphone and tablet apps which let you remote control it over WiFi!
Bad – Not much to report on this front. I still remain a bit unconvinced by Windows 8 – I think it makes most sense if you have a touchscreen / tablet type device (the HP Envy looks interesting).
Ugly – Am I the only one that thinks the front of the BMW 1 series looks horrible?
January 03, 2012
Filed Under (Technology) by Ollie Cronk on 03-01-2012
It's been a while since I've posted one of my tech discoveries, so this will cover quite a bit…
Originally wrote this back in April 2011! Being a dad doesn't allow much time for blogging! In fact I am tempted to shut down this blog (given that my usage of Twitter, LinkedIn and FB means it gets less of a look-in these days) – or move it to the cloud…
Sony Vaio SA Core i7 laptop – will post a separate review in due course, but this is a really nice machine for Windows 7 (and for running a couple of other OSes via VirtualBox!). Upped mine to 8GB RAM – amazingly quick, small, light, and very good battery life in stamina mode (only downside – it can be a bit noisy / hot in speed mode under load).
Flat Ethernet cables – awesome – see my other post about my home AV setup – they are great for running under carpet or laminate, and through closed (and locked) window openings!
Google Chrome browser – very fast (makes even Firefox feel sluggish, and IE is distinctly snail-like in comparison), robust, and now it has plugins it's great – my main browser at home.
Blackberry Bold – call me a luddite (and behind the times given the recent "down with RIM" news that is all over the media) but I like a good straightforward work phone: no touchscreen, just a plain old QWERTY keypad for quickly typing out emails and texts, and amazing battery life. Oh OK, so yes, I wouldn't say no if work offered me an iPhone instead…!
Amazon selling laptop batteries for £20 – with the SSD drive and the upgrade to Win7 my 4 year old Vaio is running really well (update – well, it was! It's now been replaced with an SA Series Vaio – it now gets used when my daughter is around and I don't want to risk the new one getting attacked!).
HP Elitebook laptops – have had a Tablet and a 14″ laptop and both have been excellent. I will be disappointed if HP do drop their PC line – they do some good (if perhaps a little bulky by today’s standards) kit.
Going back to XP and Office 2003 at work, although I have now managed to get up to 2007, which is a relief! Windows 7 should come later in the year, fingers crossed!
SSD Hybrid hard disk drive – good idea in principle but needs to mature a bit (friend had one fail on him with medium term use, might have just been a dodgy one though).
November 09, 2011
Quick post – caveat: I haven't had a chance to proof-read this one and it's late, so it will have to do for now!
You may not be aware that you can use a PlayStation 3 as a media streaming/playback client using a system called DLNA (also known as UPnP). This allows you to view content from your computer on your main TV in HD. Windows Media Player can act as the "server" portion, but it's not ideal for connecting to the PlayStation.
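Under the hood, DLNA discovery rides on UPnP's SSDP protocol: clients multicast an M-SEARCH request to 239.255.255.250:1900 and media servers announce themselves in reply. The PS3 and server do all of this for you, but as a sketch (in Python, purely for illustration) this is roughly what the probe on the wire looks like:

```python
# Build (not send) the SSDP M-SEARCH probe a DLNA client multicasts to
# discover media servers on the LAN.
def build_msearch(search_target: str = "ssdp:all", mx: int = 2) -> bytes:
    lines = [
        "M-SEARCH * HTTP/1.1",
        "HOST: 239.255.255.250:1900",   # well-known SSDP multicast address/port
        'MAN: "ssdp:discover"',
        f"MX: {mx}",                    # max seconds a server may wait to reply
        f"ST: {search_target}",         # search target: what we are looking for
        "", "",                         # blank line terminates the request
    ]
    return "\r\n".join(lines).encode("ascii")

probe = build_msearch("urn:schemas-upnp-org:device:MediaServer:1")
print(probe.decode("ascii"))
```

Sending that datagram to the multicast address and listening for unicast replies is how a client builds its list of available media servers.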
Crude diagram here, might expand this with my full setup when I get a chance:
I have been trying to get this working for a while. Essentially the plan is to get access to downloaded videos, videos from my (now HD) camcorder – to save burning them to DVD or Blu-ray – and also my photos and music collection, all held on my Ubuntu Linux server (on a RAID 1 mirrored disk setup), from my TV and home cinema/HiFi setup.
Last time I tried this with a small command line utility, and my PS3 was only connected to the server via wireless – the result was stuttering music, let alone videos. So it's something I gave up on for the time being.
Recently I have been able to overcome this, as I have discovered flat gigabit Ethernet cables that I can run out of my double-glazed windows (even when shut!). So I have hacked together a gigabit backbone that connects my TV and AV kit (including the PS3) to my Linux server (in fact the very one that served this blog page to you) that holds gigabytes of multimedia (now there's a word you don't hear much these days!).
Also discovered http://code.google.com/p/ps3mediaserver/ which is a great Java-based server component for UPnP-based streaming – as the name suggests it's specifically designed for connecting the PS3 up to content…
Hey presto – excellent quality video (including 1080p) and music on the TV / HiFi!
Next: to work out how to get my iTunes music (the stuff that only plays in iTunes rather than MP3s) across and available to the PS3. I have moved and shared my iTunes media folder on the network too – as per these instructions – http://lifehacker.com/230605/hack-attack-share-your-itunes-music-library-over-your-home-network – so I can re-use iTunes across different machines and keep it backed up.
September 22, 2011
This is an article I have been stewing on for a while, and having recently changed from a consultancy largely working on public sector IT projects back to a private sector IT department, it's given me several different viewpoints.
I also recently attended the excellent Zapthink SOA and Cloud course in Amsterdam – so I am now a Licensed/Certified Zapthink Architect!
Time for a change?
In the continued difficult financial climate, will organisations continue to have the appetite and budget to invest in large-scale greenfield COTS (Commercial Off The Shelf) IT projects and licensing – e.g. large-scale commercial enterprise systems such as ERP? And what's the next success for Open Source Software (OSS)?
Is the future more incrementally / agile-delivered, open source, best-of-breed systems, rather than big monolithic, generic packaged software that does everything OK but doesn't excel at much, if anything – and, worst of all, often requires the business to change its processes to fit the software? Of course the lines between commercial software and Open Source are becoming more blurry – with "Commercial Open Source" (in other words, commercially backed and supported).
I am thinking here of solutions that are developed on open standards / common platforms (eg J2EE), using common / standards-based middleware and the XML family of technologies to connect them together. Of course there is a risk that if you pick and choose lots of niche software that does its job well, you can end up with a big mess of spaghetti integration and duplication. But that is where effective architecture, standards and governance come in: to keep things on the right track and aligned with business priorities.
Certainly the agile (iterative) methodology seems to be taking hold in larger companies, although waterfall still seems to be favoured in government – due to the perception that it will result in a fixed cost. Unfortunately, too often it doesn't deliver successful results: it's too rigid, it ends up costing far more through the vendor's cunning use of change control and, depending on the project, the initial build can be as little as 10% of the total costs in any case.
What about the cloud? Isn’t that supposed to reduce costs…
I think many in the IT industry (well, some vendors anyway) right now would argue that the answer to this is delivery via the cloud using a pay-as-you-need-it service-based model (to get away from having to make the big upfront investment in hardware and licensing). I guess this is an option, but I think most large businesses (who have the budgets for the larger IT projects) are looking at the cloud quite sceptically, waiting for it to mature beyond e-Commerce and online-type applications and add the required security and reliability. Instead they are keeping things in their own data centre and exploiting virtualisation to optimise costs at the infrastructure layer. Cloud as your Disaster Recovery (DR) / data archiving environment looks like one of the most compelling use cases so far.
I am seeing some suggestions that organisations would like to adopt this approach in some areas (eg Integration). In fact one of the places I worked in the past built its own home grown ERP / eLearning platform on Open Source. In my current role we are looking at Open Source alternatives – particularly for Integration and Infrastructure glue.
It's interesting to see how the adoption of Open Source has matured – from just the Linux OS used for servers, and Linux + Apache for static web, moving towards LAMP and other Apache projects such as Tomcat, even more so with "Web 2.0". Data integration is a big area for OSS too – eg Talend for ETL, ActiveMQ for messaging, GlassFish as an application server – and J2EE is a big success story as well.
And of course now, with Android, OSS has finally come into contact with the casual end user (rather than the techies like me that run Linux on the desktop). This was brought home to me the other day when a completely non-IT friend showed me his Motorola Xoom and was extolling its usability.
Interesting times. Wonder where OSS will infiltrate next? I guess the answer is probably wherever it can disrupt the marketplace in an engaging way for the consumer, or with a commercial model that is compelling to business/IT decision makers.
February 01, 2011
Filed Under (Architecture and Strategy) by Ollie Cronk on 01-02-2011
Since attending the seminar by Gartner on Enterprise Architecture last year I have been focusing on formalising my IT architecture skills (well, when time allows!!). TOGAF (The Open Group Architecture Framework) 9 appears to be the way to go. You can think of it in much the same way as PRINCE2 is to project management – it provides the core principles but needs to be tailored to the organisational specifics.
Came across “TOGAF 9 in pictures” available on http://www.orbussoftware.com/downloads which is a really effective way of getting to grips with the core concepts.
Also (on the same site) found a stencil for ArchiMate; ArchiMate is a means of standardising the way that Enterprise Architecture is defined at a high level. Chapter 2 of the specification makes good reading – it has a nice summary on why EA: http://www.opengroup.org/archimate/doc/ts_archimate/index.html
From my research, IT architecture (in particular Enterprise Architecture) still seems to be something that different people and organisations view differently – in particular, role definitions / responsibilities seem to vary massively. I also fear that often the goals behind EA initiatives aren't clear enough and some organisations just want to "tick the EA box" rather than get true value from it.
February 01, 2011
Filed Under (Technology) by Ollie Cronk on 01-02-2011
I am seriously impressed by Microsoft OneNote and the HP Tablet that I am now using. I realised I was creating quite a bit of waste paper from my notes – so I have moved them to electronic form.
I have a workbook for work and one for personal tasks and sync the workbooks between my home and work folders which is really neat. I have been using this setup for several months (mostly using the keyboard although sometimes in meetings the stylus is a faster and easier way of capturing thoughts and diagrams).
Visio 2010 is a pretty cool tool; I would have to say that Windows 7, Visio and OneNote are my top 3 Microsoft products right now!
Visio 2010 even has support for Inking and Multi-touch – http://blogs.msdn.com/b/visio/archive/2009/12/18/visio-2010-better-with-windows-7.aspx and the MS blog has some other pretty handy tips: http://blogs.msdn.com/b/visio/
December 17, 2010
Filed Under (Architecture and Strategy, Open Source, Technology, Web Development) by Ollie Cronk on 17-12-2010
We’ve gone through quite a few security / penetration / web application tests at work (often as part of compliance with HMG SPF / InfoSec standards for UK Government projects) and I thought it would be useful to list some of the steps you need to consider (hardening, configuring etc) to ensure your application has a reduced security exposure. I feel that you should view security testing as an opportunity to improve the quality of your work rather than see it as a box-ticking exercise (ultimately the testing is about making your application more secure, which can only be a good thing). Whilst a lot of our work is based on LAMP (Linux, Apache, MySQL, PHP), many of the concepts below apply regardless of the technology used.
Firewalls and Port Access
Firewalls and access to ports – one of the most obvious – but you need to consider whether the risk profile requires one or two levels of hardware firewall, or whether iptables is sufficient. Can you lock down the environment such that you only expose port 80 or 443 to the wider internet and create a restricted IP address based whitelist for administration (eg SSH access)? On many of our architectures we only expose the load balancer(s) and/or proxy layer to the internet; everything else is not available at all to general IP addresses across the internet.
If you do have to have SSH open to all, make sure that you install denyhosts (which helps to prevent SSH brute force attacks by adding persistent bad username/password attempts to /etc/hosts.deny – preventing access from the offending IP address).
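As a sketch of that lockdown, the snippet below generates (rather than applies) the iptables commands: only 80/443 open to the world, SSH restricted to a whitelisted admin address. The admin IP is a made-up example – review and adapt the rules to your environment before piping them to a root shell.

```python
# Generate - don't apply - iptables rules for the lockdown described above.
ADMIN_IP = "203.0.113.10"  # hypothetical management address; use your own

def lockdown_rules(admin_ip: str) -> list:
    return [
        "iptables -P INPUT DROP",  # default deny on inbound traffic
        "iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT",
        "iptables -A INPUT -p tcp --dport 80 -j ACCEPT",   # HTTP to the world
        "iptables -A INPUT -p tcp --dport 443 -j ACCEPT",  # HTTPS to the world
        # SSH only from the whitelisted admin address
        f"iptables -A INPUT -p tcp -s {admin_ip} --dport 22 -j ACCEPT",
    ]

for rule in lockdown_rules(ADMIN_IP):
    print(rule)
```

Emitting the rules as text first means they can be peer reviewed (and kept in version control) before anyone runs them as root.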
Cross Site Scripting (XSS) and SQL Injection vectors
Try typing "><script>alert('If you see this in an alert box there is an XSS vector in your application')</script> into a username box (for example) and see what it does. If it brings up an alert dialog you know you have a problem. See the XSS Wikipedia page for more info.
Similarly for SQL – if you put in rogue SQL keywords, does it mess with the SQL that is run? Do something non-destructive (particularly if you are spot-checking a live web site environment!). A good example I like to use: can I add parameters to a WHERE clause to see data I shouldn't be able to see?
Personally I prefer two levels of checks for SQL injection and XSS-type code in application input: one at the application input layer (eg sanitising user input as soon as possible) and another at the database interface / wrapper layer to ensure nothing nasty can get sent to be stored or messed about with on the database tier.
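Our stack is PHP, but the two-level principle is easy to show in a few lines of Python (with SQLite standing in for MySQL): escape at the input boundary, and use bound parameters at the database layer so rogue keywords stay data rather than becoming SQL. The table and values are invented for illustration.

```python
import html
import sqlite3

# Level 1: sanitise user input at the application boundary.
def clean(text: str) -> str:
    return html.escape(text)  # neutralises <script>-style payloads

# Level 2: never build SQL by string concatenation - bind parameters
# at the database wrapper layer.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES (?)", ("alice",))

def find_user(name: str):
    # the ? placeholder keeps injected keywords as literal data
    return conn.execute(
        "SELECT name FROM users WHERE name = ?", (name,)
    ).fetchall()

print(clean("><script>alert('xss')</script>"))
print(find_user("alice' OR '1'='1"))  # -> [] : the injection matches nothing
```

With both layers in place, a payload that slips past one check still cannot execute at the other.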
Server Hardening / Configuring
Ensure the server is set up and configured properly:
Google for and check the hardening guide for the operating system for recommended steps.
Ensure that security updates are being applied on a regular basis.
Ensure that anti-virus software is installed (for the Linux platform, ClamAV is an option).
Review (and peer review if possible) the configuration files for the main services on this box – for LAMP this means at a minimum the Apache config (httpd.conf / apache2.conf), PHP's php.ini, MySQL's my.cnf and SSH's sshd_config.
(You can run locate <name of config file> to check where it is located.)
These checks are particularly important if you are having a white box review of your system (where you give the SSH login details to a security tester to check the configuration).
Pre-test checks
Before you hand over the system to the internet security guys, run some of the kinds of tools that they will be running yourself to see what is visible. As a minimum, run an NMAP command against your IP addresses:
nmap -A -vv [IP Address]
And see what ports (and what information about those ports) are returned. Also check whether NMAP can enumerate which operating system and which versions of web server software are running (can you do anything to remove version numbers or product names?).
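If you just want a quick yes/no on a handful of ports from a script, a crude supplement to nmap (host and port list here are examples) might look like:

```python
import socket

# Quick TCP reachability check - not a replacement for nmap's service and
# version detection, just a fast open/closed answer per port.
def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for port in (22, 80, 443):
    state = "open" if port_open("127.0.0.1", port) else "closed"
    print(f"{port}/tcp {state}")
```

Handy for a pre-test smoke check that your firewall rules actually took effect after a config change.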
These days I like to use BackTrack (a Linux distribution designed for security testing) for security checks. I am running it as a virtual machine from within my Windows 7 machine (http://g0tmi1k.blogspot.com/2010/01/tutorial-video-how-to-install-backtrack.html is a useful video for getting it set up).
I could probably write all day about security, but hopefully this gives a feel for the key aspects. Would be interested to hear anyone's tips or must-dos for LAMP security.
October 23, 2010
Recently changed the server this blog runs on to a low-power dual-core Intel Atom in a smaller form factor case (mini-ITX), in an attempt to reduce my environmental and electricity footprint. Took the opportunity to upgrade Ubuntu Server to 10.04 LTS, which comes with MySQL 5.1, and WordPress is now 3.0.1 (which was a very easy upgrade – one click from within the web-based admin – well done WordPress team for that!).
The dual-core Opteron box this blog used to run on will now only be powered up when I am experimenting with server operating systems (it will be re-built as a VMware ESX host).
Getting in some IT geekery before my life gets turned upside down!
August 27, 2010
Continuing in my series on professional development – see the previous article on documentation here (ok so there has been a bit of a pause and I am stretching things to call this a series – I had intended to post this some time ago!). This post concentrates on the benefits of using an Issue / Task / Bug Tracking Tool…
Keeping track of development tasks and issues in a centralised system helps enormously. Living without task tracking for your issues is a lot like not having source control for your code. A good task tracking system – such as FogBugz or Countersoft Gemini – helps keep track of what the team needs to do, allows issues to be delegated / reallocated to more appropriate team members, and enables multiple lines of support (eg 1st line, 2nd line etc). It also allows transparency on tasks (allowing Jane, who raised a request, to check the issue tracker for progress rather than interrupting the technical team) and (particularly for those of us that need to follow ISO9001-type standards) provides an audit trail if used properly.
Of course it's not just about standing up an issue tracking tool – you need to agree on things like:
- what defines a High Severity issue over a Medium Severity one? What Service Level Agreements do we have and how does the issue tracker tie to those (eg does selecting Medium mean a response within a day, as opposed to High which requires a response within one hour, for argument's sake)?
- what is the process from issue inception through to resolution (does a new change request go to Bob – or better, Bob's role, "Change Manager" – who allocates it to someone to estimate and changes the status to pending estimation)?
- what level of documentation are you looking for in the comments associated with a case – is just referencing a source control commit enough (which is OK if your source control commit messages are verbose) or do you want a short explanation of what was done?
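The severity-to-SLA mapping in the first point can literally be a lookup table the tracker (or a report against it) enforces. The severities and response times below are invented for argument's sake, not a recommendation:

```python
from datetime import timedelta

# Hypothetical severity -> first-response SLA mapping, agreed with the
# business and then wired into the issue tracker's reporting.
SLA_RESPONSE = {
    "high":   timedelta(hours=1),
    "medium": timedelta(days=1),
    "low":    timedelta(days=5),
}

def response_due(severity: str) -> timedelta:
    """Return the agreed first-response window for a given severity."""
    return SLA_RESPONSE[severity.lower()]

print(response_due("High"))  # -> 1:00:00
```

Writing it down this explicitly removes the "what does Medium actually mean?" arguments later.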
Clearly, if done right, this can allow your team to scale and stop your developers getting bogged down with admin (make sure there is someone overseeing the issue tracker). It can also make it easier to separate support work from bigger new development work (the former you can give to junior colleagues to help them get up to speed, with support from more senior ones – preventing the senior guys/girls from getting bored with the smaller stuff).
One other observation on this: whilst you do need to be strict in order to implement these tools (eg ensuring that folks always use the tracker rather than continuing to email you all the time), you need to make sure they don't become a barrier to communication between the technical team and its customers. One thing I like to do when involved in an operational issue is to cc the issue tracker in on a more detailed email explaining the issue – the customer gets a personalised response and the issue tracker captures the commentary (preventing time wasted on copying and pasting).
Would love to hear others' thoughts on their use of issue tracking systems and the pros / cons.
August 26, 2010
Those of us in the IT profession (or Information Management, as one colleague recently suggested as an alternative*) don't do ourselves many favours when it comes to using complex terminology and also expecting business people to understand and embrace IT best practices…
Whilst adopting concepts/practices such as Enterprise Architecture (EA), Data Governance, Information Management (IM) and Knowledge Management (KM) is all well and good, the sheer number of buzz phrases and concepts must be bewildering for most non-techies. I will admit that sometimes I struggle with the difference, for example, between Master Data Management and Master Reference Data without resorting to Google or Wikipedia.
Of course some will argue that is what the Architect or Analyst roles are all about – to match business requirements to IT solutions. But if we ever want colleagues, clients or stakeholders to truly embrace the concepts of Knowledge Management or Data Management / Governance, we need to break down these barriers.
It's all too easy to get DM/IM/KM confused if it's not the way you think. Generally, at a high level, it's accepted that data can be converted into useful information, and that humans (eg employees) walk around with a lot of knowledge that often needs to be managed (and shared) more effectively. But often we don't take the time to even explain these concepts – we just jump into enterprise IT lingo and expect others to know what we are on about (or why it makes business sense). Sometimes colleagues can get confused by products such as SharePoint and what they do – they can think they are the solution to Knowledge Management, when actually they are just the product or underlying tool that can enable Knowledge Management – it's embracing the core concepts of KM that is key.
If we are not careful we will start to regress back to the bad old days of IT, where the IT guy was locked in the cupboard as no one understood him… OK, maybe that's going too far, but you know what I mean!
Or maybe I am being unfair? After all, every different business area I have worked in seems to have its own acronyms (finance is a nightmare with IPOs, CFAs, swaps, derivatives etc etc) – is it now accepted practice to just Google terms you don't understand and be proactive about learning these things? Unfortunately, in my experience, some people aren't prepared to do that (unless it's in their area of expertise) – and you just switch them off or lose them before you can sell them the juicy or beneficial part of the story.
* Information Management was selected so as not to confuse people with "plain old Information Technology" – the physical desktop PCs, laptops etc and kit that every business needs. Information Management, it was argued, is different: it is the leveraging of IT capability (where IM people are part of the core business team) to improve the way information is managed (or processes are operated) and used as an asset, rather than something just delegated to IT to "sort out".