September 22, 2011
This is an article I have been stewing on for a while, and having recently moved from a consultancy largely working on public sector IT projects back to a private sector IT department has given me several different viewpoints.
I also recently attended the excellent Zapthink SOA and Cloud course in Amsterdam – so I am now a Licensed/Certified Zapthink Architect!
Time for a change?
In the continued difficult financial climate, will organisations continue to have the appetite and budget to invest in large-scale greenfield COTS (Commercial Off the Shelf) IT projects and licensing – e.g. large commercial enterprise systems such as ERP? And what's the next success for Open Source Software (OSS)?
Is the future incrementally / agilely delivered open source, best-of-breed systems, rather than big monolithic, generic packaged software that does everything OK but doesn't excel at much, if anything? And worst of all, the latter often requires the business to change its processes to fit the software. Of course the lines between commercial software and Open Source are becoming blurrier – with "Commercial Open Source" (in other words, commercially backed and supported).
I am thinking here of solutions that are developed on open standards / common platforms (eg J2EE) using common / standards-based middleware and the XML family of technologies to connect them together. Of course there is a risk that if you pick and choose lots of niche software that each does its job well, you can end up with a big mess of spaghetti integration and duplication. But that is where effective architecture, standards and governance come in: to keep things on the right track and aligned with business priorities.
Certainly the agile (iterative) methodology seems to be taking hold in larger companies, although waterfall still seems to be favoured in government – due to the perception that it will result in a fixed cost. Unfortunately it too often doesn't deliver successful results: it's too rigid, it ends up costing far more through cunning use of change control by the vendor, and depending on the project the initial build can be as little as 10% of the total cost in any case.
What about the cloud? Isn’t that supposed to reduce costs…
I think many in the IT industry (well, some vendors anyway) right now would argue that the answer to this is delivery via the cloud using a pay-as-you-need-it, service-based model (to get away from having to make the big upfront investment in hardware and licensing). I guess this is an option, but I think most large businesses (who have the budgets for the larger IT projects) are looking at the cloud quite sceptically, waiting for it to mature beyond e-Commerce and online-type applications and add the required security and reliability. For now they are keeping things in their own data centres and exploiting virtualisation to optimise costs at the infrastructure layer. Cloud as your Disaster Recovery (DR) / data archiving environment looks like one of the most compelling use cases so far.
I am seeing some suggestions that organisations would like to adopt this approach in some areas (eg Integration). In fact one of the places I worked in the past built its own home grown ERP / eLearning platform on Open Source. In my current role we are looking at Open Source alternatives – particularly for Integration and Infrastructure glue.
It's interesting to see how the adoption of Open Source has matured – from just the Linux OS used for servers, to Linux + Apache for static web, moving towards LAMP and other Apache projects such as Tomcat, even more so with "Web 2.0". Data integration is a big area for OSS too – eg Talend for ETL, ActiveMQ for messaging, Glassfish as an application server. J2EE is a big success story as well.
And of course now, with Android, OSS has finally come into contact with the casual end user (rather than the techies like me that run Linux on the desktop). This was brought home to me the other day when a completely non-IT friend showed me his Motorola Xoom and was extolling its usability.
Interesting times. Wonder where OSS will infiltrate next? I guess the answer is probably wherever it can disrupt the marketplace in an engaging way for the consumer, or with a commercial model that is compelling to business/IT decision makers.
February 01, 2011
Filed Under (Architecture and Strategy) by Ollie Cronk on 01-02-2011
Since attending the seminar by Gartner on Enterprise Architecture last year I have been focussing on formalising my IT Architecture skills (well, when time allows!!). TOGAF (The Open Group Architecture Framework) 9 appears to be the way to go. You can think of it in much the same way as PRINCE2 relates to project management – it's something that provides the core principles but needs to be tailored to the organisational specifics.
Came across “TOGAF 9 in pictures” available on http://www.orbussoftware.com/downloads which is a really effective way of getting to grips with the core concepts.
Also (on the same site) I found a stencil for ArchiMate; ArchiMate is a means of standardising the way that Enterprise Architecture is defined at a high level. Chapter 2 of the specification makes good reading – it has a nice summary of why EA: http://www.opengroup.org/archimate/doc/ts_archimate/index.html
From my research, IT Architecture (in particular Enterprise Architecture) still seems to be something that different people and organisations view differently – in particular, role definitions / responsibilities seem to vary massively. I also fear that the goals behind EA initiatives often aren't clear enough, and that some organisations just want to "tick the EA box" rather than get true value from it.
February 01, 2011
Filed Under (Technology) by Ollie Cronk on 01-02-2011
I am seriously impressed by Microsoft OneNote and the HP tablet I am now using. I realised I was creating quite a bit of waste paper with my notes – so I have moved them to electronic form.
I have a workbook for work and one for personal tasks and sync the workbooks between my home and work folders which is really neat. I have been using this setup for several months (mostly using the keyboard although sometimes in meetings the stylus is a faster and easier way of capturing thoughts and diagrams).
Visio 2010 is a pretty cool tool too – I would have to say that Windows 7, Visio and OneNote are my top 3 Microsoft products right now!
Visio 2010 even has support for Inking and Multi-touch – http://blogs.msdn.com/b/visio/archive/2009/12/18/visio-2010-better-with-windows-7.aspx and the MS blog has some other pretty handy tips: http://blogs.msdn.com/b/visio/
December 17, 2010
Filed Under (Architecture and Strategy, Open Source, Technology, Web Development) by Ollie Cronk on 17-12-2010
We've gone through quite a few security / penetration / web application tests at work (often as part of compliance with HMG SPF / InfoSec standards for UK Government projects) and I thought it would be useful to list some of the steps you need to consider (hardening, configuring etc) to ensure your application has a reduced security exposure. I feel you should view security testing as an opportunity to improve the quality of your work rather than as a box-ticking exercise (ultimately the testing is about making your application more secure, which can only be a good thing). Whilst a lot of our work is based on LAMP (Linux, Apache, MySQL, PHP), many of the concepts below apply regardless of the technology used.
Firewalls and Port Access
Firewalls and access to ports – one of the most obvious, but you need to consider whether the risk profile requires one or two levels of hardware firewall, or whether iptables is sufficient. Can you lock down the environment so that you only expose port 80 or 443 to the wider internet, and create a restricted, IP-address-based white list for administration (eg SSH access)? On many of our architectures we only expose the load balancer(s) and/or proxy layer to the internet; everything else is not reachable at all from general IP addresses across the internet.
If you do have to have SSH open to all, make sure that you install denyhosts (which helps to prevent SSH brute force attacks by adding persistent bad username/password attempts to /etc/hosts.deny – preventing access from the offending IP address).
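The technique denyhosts automates can be sketched in a few lines of Python – this is purely an illustration of the idea (the sample log lines, the regex and the threshold are made up for the example), not how denyhosts itself is implemented:

```python
import re
from collections import Counter

# Hypothetical sample of sshd log entries; the real ones live in /var/log/auth.log
LOG_LINES = [
    "sshd[1001]: Failed password for invalid user admin from 203.0.113.7 port 4022 ssh2",
    "sshd[1002]: Failed password for root from 203.0.113.7 port 4023 ssh2",
    "sshd[1003]: Failed password for root from 203.0.113.7 port 4031 ssh2",
    "sshd[1004]: Accepted password for ollie from 198.51.100.2 port 51022 ssh2",
]

FAILED = re.compile(r"Failed password for .* from (\d+\.\d+\.\d+\.\d+)")

def offending_ips(lines, threshold=3):
    """Return hosts.deny-style entries for IPs with `threshold`+ failed logins."""
    counts = Counter(
        FAILED.search(line).group(1)
        for line in lines
        if FAILED.search(line)
    )
    return ["sshd: " + ip for ip, n in counts.items() if n >= threshold]

print(offending_ips(LOG_LINES))  # ['sshd: 203.0.113.7']
```

The entries returned are in the `daemon: IP` format that /etc/hosts.deny expects.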
Cross Site Scripting (XSS) and SQL Injection vectors
A quick test: see what entering "><script>alert('If you see this in an alert box there is an XSS vector in your application')</script> into a username box (for example) does. If it brings up an alert dialog you know you have a problem. See the XSS Wikipedia page for more info.
Similarly for SQL – if you put in rogue SQL keywords, does it mess with the SQL that is run? Do something non-destructive (particularly if you are spot checking a live web site environment!). A good example I like to use: can I add conditions to a WHERE clause to see data I shouldn't be able to see?
Personally I prefer two levels of checks for SQL Injection and XSS type code in application input: one at the application input layer (eg sanitising user input asap) and another at the database interface / wrapper layer, to ensure nothing nasty can get sent to be stored or messed about with on the database tier.
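To make the two-level idea concrete, here's a minimal sketch – illustrative only, shown in Python with an in-memory SQLite database rather than the PHP/MySQL we'd actually use (the table and function names are made up): escape at the input layer, parameterise at the database layer so user input never becomes part of the SQL text.

```python
import html
import sqlite3

def sanitise(value: str) -> str:
    """Level 1 (application input layer): escape HTML metacharacters so
    stored input can't later be replayed into a page as live script (XSS)."""
    return html.escape(value)

def save_user(conn, username: str) -> None:
    """Level 2 (database wrapper layer): a parameterised query keeps user
    input out of the SQL statement entirely, so rogue keywords can't alter it."""
    conn.execute("INSERT INTO users (name) VALUES (?)", (sanitise(username),))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")

# Both hostile inputs end up stored as inert text, not executed:
save_user(conn, "<script>alert('xss')</script>")
save_user(conn, "bob'; DROP TABLE users; --")
print(conn.execute("SELECT name FROM users").fetchall())
```

The same split works in PHP: sanitise as input arrives, then use prepared statements (eg via PDO) at the database boundary.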
Server Hardening / Configuring
Ensure the server is set up and configured properly:
Google for the operating system's hardening guide and check the recommended steps.
Ensure that security updates are being applied on a regular basis.
Ensure that anti-virus software is installed (for the Linux platform, ClamAV is an option).
Review (and peer review if possible) the configuration files for the main services on this box – for LAMP this means at a minimum the Apache config (httpd.conf / apache2.conf), the MySQL config (my.cnf), the PHP config (php.ini) and the SSH config (sshd_config).
(You can run locate <name of config file> to check where it is located)
These checks are particularly important if you are having a white box review of your system (where you give the SSH login details to a security tester to check the configuration).
Pre-test checks
Before you hand over the system to the internet security guys, run some of the kinds of tools they will be running yourself to see what is exposed. As a minimum, run an NMAP command against your IP addresses:
nmap -A -vv [IP Address]
And see what ports (and information about those ports) are returned. Also check whether NMAP can enumerate which operating system and versions of web server software are running (can you do anything to remove version numbers or product names?).
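On the version-number point, here's a small illustrative Python sketch of the kind of banner grab NMAP performs – the canned response and function names are my own inventions for the example (on Apache, the ServerTokens Prod directive trims this header right down):

```python
import socket

def grab_banner(host: str, port: int = 80, timeout: float = 5.0) -> str:
    """Issue a bare HTTP HEAD request and return the raw response headers."""
    with socket.create_connection((host, port), timeout=timeout) as s:
        s.sendall(b"HEAD / HTTP/1.0\r\nHost: " + host.encode() + b"\r\n\r\n")
        return s.recv(4096).decode("latin-1")

def parse_server_header(response: str) -> str:
    """Pull out the Server: header a web server volunteers - exactly the
    product/version detail you want to minimise before a pen test."""
    for line in response.split("\r\n"):
        if line.lower().startswith("server:"):
            return line.split(":", 1)[1].strip()
    return "(no Server header)"

# A canned response, as a talkative default install might return it:
canned = "HTTP/1.1 200 OK\r\nServer: Apache/2.2.3 (CentOS) PHP/5.1.6\r\n\r\n"
print(parse_server_header(canned))  # Apache/2.2.3 (CentOS) PHP/5.1.6
```

Run `grab_banner` only against machines you own – if it comes back naming exact versions, tighten the config before the testers find it.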
These days I like to use BackTrack (a Linux distribution designed for security testing) for security checks. I am running it as a virtual machine from within my Windows 7 machine (http://g0tmi1k.blogspot.com/2010/01/tutorial-video-how-to-install-backtrack.html is a useful video for getting it set up).
I could probably write all day about security but hopefully this gives a feel for the key aspects. Would be interested to hear anyone’s tips or must dos for LAMP security.
August 27, 2010
Continuing in my series on professional development – see the previous article on documentation here (ok so there has been a bit of a pause and I am stretching things to call this a series – I had intended to post this some time ago!). This post concentrates on the benefits of using an Issue / Task / Bug Tracking Tool…
Keeping track of development tasks and issues in a centralised system helps enormously. Living without task tracking for your issues is a lot like not having source control for your code. A good task tracking system – such as Fogbugz or Countersoft Gemini – helps keep track of what the team needs to do, allows issues to be delegated / reallocated to more appropriate team members, and enables multiple lines of support (eg 1st line, 2nd line etc). It also allows transparency on tasks (allowing Jane, who requested a new feature, to check the issue tracker for progress rather than interrupting the technical team) and (particularly for those of us that need to follow ISO9001-type standards) provides an audit trail if used properly.
Of course it's not just about standing up an issue tracking tool – you need to agree on things like:
- What defines a high-severity issue over a medium-severity one? What Service Level Agreements do we have, and how does the issue tracker tie to those (eg does selecting medium mean a response within a day, as opposed to high which requires a response within one hour, for argument's sake)?
- What is the process from issue inception through to resolution (does a new change request go to Bob – or better, Bob's role, "Change Manager" – who allocates it to someone to estimate and changes the status to pending estimation)?
- What level of documentation are you looking for in the comments associated with a case – is just referencing a source control commit enough (which is OK if your source control commit messages are verbose), or do you want a short explanation of what was done?
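The severity-to-SLA mapping in the first point can be captured very simply – this is a made-up illustration in Python (the severities and timings are examples only; agree the real ones with the business):

```python
from datetime import datetime, timedelta

# Hypothetical severity-to-SLA mapping - yours comes from the SLA document
SLA_RESPONSE = {
    "high": timedelta(hours=1),
    "medium": timedelta(days=1),
    "low": timedelta(days=5),
}

def response_due(severity: str, raised_at: datetime) -> datetime:
    """When must the first response happen for an issue of this severity?"""
    return raised_at + SLA_RESPONSE[severity]

raised = datetime(2010, 8, 27, 9, 0)
print(response_due("high", raised))    # one hour later: 2010-08-27 10:00:00
print(response_due("medium", raised))  # next day: 2010-08-28 09:00:00
```

Encoding the mapping somewhere explicit (even a config table in the tracker) means everyone sees the same deadlines, rather than each person guessing what "medium" implies.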
Clearly, if done right, this can allow your team to scale and stop your developers getting bogged down with admin (make sure there is someone overseeing the issue tracker). It can also make it easier to separate support work from bigger new development work (the former you can give to junior colleagues to help them get up to speed, with support from more senior ones – preventing the senior guys/girls getting bored with the smaller stuff).
One other observation: whilst you do need to be strict in order to implement these tools (eg ensuring that folks always use the tracker rather than continuing to email you all the time), you need to make sure they don't become a barrier to communication between the technical team and its customers. One thing I like to do when involved in an operational issue is to cc the issue tracker in on a more detailed email explaining the issue – the customer gets a personalised response and the issue tracker captures the commentary (preventing time wasted on copying and pasting).
Would love to hear others thoughts on their use of issue tracking systems and the pros / cons.
August 26, 2010
Those of us in the IT profession (or Information Management, as one colleague recently suggested as an alternative*) don't do ourselves many favours when it comes to using complex terminology, and also in expecting business people to understand and embrace IT best practices…
Whilst adopting concepts/practices such as Enterprise Architecture (EA), Data Governance, Information Management (IM) and Knowledge Management (KM) is all well and good, the sheer number of buzz phrases and concepts must be bewildering for most non-techies. I will admit that sometimes I struggle, for example, with the difference between Master Data Management and Master Reference Data without resorting to Google or Wikipedia.
Of course some will argue that is what the Architect or Analyst roles are all about – matching business requirements to IT solutions. But if we ever want colleagues, clients or stakeholders to truly embrace the concepts of Knowledge Management or Data Management / Governance, we need to break down these barriers.
It's all too easy to get DM/IM/KM confused if it's not the way you think. Generally / at a high level, it's accepted that data can be converted into useful information, and that humans (eg employees) walk around with a lot of knowledge that often needs to be managed (and shared) more effectively. But often we don't take the time to even explain these concepts – we just jump into enterprise IT lingo and expect others to know what we are on about (or why it makes business sense). Sometimes colleagues can get confused by products such as Sharepoint and what they do – they can think such products are the solution to Knowledge Management, when actually they are just the underlying tools that can enable it; embracing the core concepts of KM is what is key.
If we are not careful we will start to regress back to the bad old days of IT, where the IT guy was locked in the cupboard because no one understood him… OK, maybe that's going too far, but you know what I mean!
Or maybe I am being unfair? After all, every business area I have worked in seems to have its own acronyms (finance is a nightmare, with IPOs, CFAs, swaps, derivatives etc) – is it now accepted practice to just Google terms you don't understand and be proactive about learning these things? Unfortunately, in my experience some people aren't prepared to do that (unless it's in their area of expertise) – and you switch them off or lose them before you can sell them the juicy or beneficial part of the story.
* Information Management was chosen so as not to confuse people with "plain old Information Technology" – the physical desktop PCs, laptops and kit that every business needs. Information Management, it was argued, is different: it is the leveraging of IT capability (where IM people are part of the core business team) to improve the way information is managed (or processes are operated) and used as an asset, rather than something just delegated to IT to "sort out".
March 01, 2010
Here are some notes on interesting products/thoughts that were mentioned (apologies – this is more a set of notes for me than a proper blog post; if I get time I will refine it!).
Started the day on a conference call back to the office, so I had to miss the keynote, which was a shame as it was by quite an eccentric guy whom Microsoft have hired (as a UX Architect Evangelist), talking largely about keeping things simple, and about usability, from what I gathered of the end of the talk.
The day was very tough as I had had a late night catching up on various things to free up the Friday – it's difficult sitting through talks when really tired!
Met with several former colleagues from my last company (and former colleagues from my current company) so was a bit of a blast from the past at times.
There appears to be a lot of development and interest around NoSQL / document-based databases at the moment – definitely something to keep an eye on as the technology matures.
RDBMS in the social networks age
Database graph structures via advanced features of SQL, using SQL-99 and SQL-2003 functionality – which MySQL certainly doesn't have, and many other DBs won't have the 2003 extensions either. Obviously, using this kind of advanced functionality will have an impact on database server load.
This talk felt a bit like it was flying in the face of most new thinking at the moment (although, to be fair, this is partly what Lorenzo has now put on his website below), which is to keep your database tier minimally loaded, as it's the part that has the most issues with vertical and horizontal scalability (keep most of the CPU load in the web app tier, as it's easier to add more nodes there).
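For anyone curious what graph traversal in SQL looks like, here is a small illustrative sketch using a recursive common table expression – shown via Python and SQLite purely for convenience (modern SQLite supports WITH RECURSIVE; the MySQL of the time did not, which was the talk's point). The table and data are made up:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE follows (follower TEXT, followed TEXT);
    INSERT INTO follows VALUES
        ('alice', 'bob'), ('bob', 'carol'), ('carol', 'dave');
""")

# Walk the follow graph from 'alice' with a recursive CTE - the kind of
# SQL-99/2003 functionality the talk covered.
reachable = conn.execute("""
    WITH RECURSIVE reach(person) AS (
        SELECT 'alice'
        UNION
        SELECT f.followed FROM follows f JOIN reach r ON f.follower = r.person
    )
    SELECT person FROM reach WHERE person != 'alice' ORDER BY person
""").fetchall()
print([p for (p,) in reachable])  # ['bob', 'carol', 'dave']
```

Without recursive CTEs you'd either issue one query per hop from application code, or maintain a denormalised closure table – both of which shift the load away from the database tier, in line with the scalability point above.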
Slides available at:
Legacy Code Talk by Ibuildings
BOUML bouml.free.fr (reverse engineering capabilities)
phpcs – Codesniffer (part of PhpUnderControl)
Thoughts for tackling older PHP4-based projects and code bases – get them into source control, and start to apply Continuous Integration type approaches.
Suggestions made around
Web and mobile application monetisation models / Paypal X
Paypal appear to be launching a new platform / API
Bit disappointed by this one, as it was about PayPal's API (https://www.x.com) rather than strategies for monetisation, which is what the title led me to believe.
Web Services Best Practice
Lorna (also from iBuildings), who gave this talk, has a slightly sarcastic, talking-down-to-you tone that I found a bit annoying – maybe she gives training to newbies all the time or something. Or maybe I was just tired. She had some interesting things to say about web services design, particularly towards the end of her talk. The talk was caveated as being a bit of "a rant" and it was exactly that in places – it felt like she was having a go at everyone a lot of the time!
Beers at the end sponsored by Facebook were a nice touch though, although I only had time to grab a quick one whilst chatting to Mark Schaschke from iBuildings and a couple of guys from my previous company. Think next year I will sit this one out to allow more developers to attend as think they will get more value out of it.
July 24, 2009
Some Recent Tech Discoveries I thought I’d share:
Windows 7 RC – writing the blog post from it – excellent OS (and that says a lot coming from me!)
Spotify – sure, lots of people know about this one now, but it's a great streaming music service. Kind of like a commercial radio station where you get to choose the playlist. A native version for Linux would be nice, though (netbooks will make this kind of porting happen organically now, I suspect?).
ebox – not a good move to just try and install this on an existing Ubuntu box (tried this at home) – it screwed lots of stuff up. Nice idea, but if you want to try it out use a separate box. It looks good and the concept is a great idea, but I think it's a bit too flawed for me right now (sorry ebox devs).
Denyhosts (prevents brute force attacks on SSH by adding IP addresses that repeatedly fail to log in to a black list – in /etc/hosts.deny) silently stopped working some time ago on my Ubuntu server (due to an upgrade of Python, by the looks of things). Following the fix on this forum thread sorted the problem, although I found the file you need to change is /usr/share/denyhosts/daemon-control-dist rather than the one mentioned.
HMG Info Sec standards (or rather the OTT implementation of them) – I probably can't say any more or I'll get burned in acid (it's a long and painful story…!)
More posts to come. Enjoy the summer everyone. I intend to go on a ride around Litchfield tomorrow – embedded Google Map to follow, no doubt…!
March 15, 2009
2009 has been pretty busy so far – as you can see from the lack of activity on here! It's almost three months into '09 and this is my first post!
For those of you interested in what I’ve been up to…
At work we are in our busy end-of-FY period, and I've been going to Brussels quite a bit for a project for the European Commission. Experienced a first – a meeting/workshop with (almost all) the EU member state countries, which was really interesting.
Lots of other exciting projects are kicking off at the moment – it looks like I will continue to be busy well into the next FY, which is good. Fingers crossed the economic downturn isn't used as an excuse for (or isn't a cause of) cutting back on environmental and climate change projects.
On a personal level (also linked to work), Kat and I signed up to participate in trials of a new type of improved home thermostat ('chrono-proportional') for the EST (it's a project that work is running). It involves putting a few temperature sensors up indoors (and one outside), and we submit weekly gas/electricity meter readings. These will monitor the effectiveness of our current thermostat – then later in the year we will have the new type of thermostat fitted and we'll do the monitoring again.
The improved weather means I'm out on the bike a lot more now, which is good news. I have also combined the riding with my new toy (which I got for Christmas), a digital SLR camera – I'll probably post up some photos soon.
Recent/Interesting Web Links:
Came across www.goodguide.com, which is a product comparison site listing how ethically produced / green / healthy products are. A bit US-specific at the moment, but worth keeping an eye on.
An article on The Guardian's (free) Open Content API and DataStore. In fact the whole http://infosthetics.com website is worth a look.
I've started playing around a bit with Twitter – I have to say I was (and still am a bit) sceptical of this – so we'll see how long I use it for. So far I have linked it with my Facebook status messages. Perhaps there's a way of linking WordPress to it so it can announce new posts?
That's it for now – hopefully it won't be as long until my next post!
December 16, 2008
Some sites that I found interesting recently…
I've been looking at quite a lot of Green IT/business type issues recently; www.climatechangecorp.com is a good site (even if it does heavily promote its events in the content).
Carbon Calculators article: http://www.climatechangecorp.com/content.asp?ContentID=5119 – it's disappointing that it mentions AEA only in the context of providing the emissions factors; we actually do a whole lot more, including building online carbon/emissions data platforms (and calculators) ourselves.
Sun Microsystems came in to talk to us about their Java CAPS recently and it looks really interesting. We've already started to look at the NetBeans IDE more closely (particularly the PHP plugin and its SOA / Business Process (BPEL) diagram/visualisation-to-BPEL-XML-code features)…
Dashboards, Flex with PHP etc:
One of my current projects is to update the (data) visualisation and dashboard piece of our offering – as such, we've been researching what's out there that we could use to enhance our technology stack.
Interestingly, Microsoft seem to be about to launch a new Dynamics AX product focused on environmental data management – interesting to see the main IT vendors starting to move into the space we've been in for some time. This is really good news, as it will give more deployment and integration options moving forward (eg for customers already using Dynamics).