Wednesday, March 17, 2010

What Makes you Mad about the current state of the IT Industry?

We recently started a poll on LinkedIn to ask people what makes them mad about the current state of the IT industry. The results so far are insightful, but perhaps not surprising.
Many people cited short-termism and lack of a coherent strategy as key factors.
Others spoke about how Management (and in particular Management Processes) were preventing innovation and stopping IT from delivering what the Business wanted.
There were also a significant number of comments about the way in which employees are treated within our industry, citing lack of commitment to training etc.
When the poll is closed, I am hoping that we can publish the summary analysis on this blog. We will also share some thoughts at the upcoming Service Desk and IT Support Show which we are attending in April.
In the meantime, this is your opportunity to share a brief rant with your fellow IT professionals. It could be the constant pace of change, the lack of training, the "fire-fighting" culture, the lack of proper processes (or maybe too many?)....
Whatever particularly drives you mad about our industry, feel free to share it!
A famous entrepreneur once said that "every good idea begins with a rant".

Friday, February 12, 2010

Is "Big Bang" a good way to implement Infrastructure?

I was recently involved with a UK client who have just cancelled their Infrastructure deployment project, which was originally going to be implemented as a "Big Bang" deployment.
Part of the reason for this was the underlying risk of such a disruptive deployment.
To put this in context, when I was first brought in to review the project, I realised that the new technology was so disruptive that it would actually be far easier and quicker to have a clean switch-over, rather than trying to incrementally upgrade the infrastructure. My view was that the risk could also be managed.
Since then, the Client's Business has moved on, the risk analysis was reviewed, and they decided to move forward incrementally. This will mean only getting (say) 40% of the improvements in the same timescale. Nevertheless, because of the changed business circumstances this makes sense, and so I supported the change of Strategy.
As an IT Production Consultant, I am generally unhappy with big-bang deployments. I prefer the gradual, incremental approach, which carries less risk and is more in tune with the culture of IT Production generally.
It would be interesting to have comments from other consultants in this area.

Saturday, September 19, 2009

What's in a name? Do we call ourselves Infrastructure and Operations or IT Production?

"Infrastructure and Operations" appears to be a recognised market segment these days. It is a useful descriptive term, since it covers the main aspects of the IT Production role:

  • Infrastructure - looking after the equipment, hardware, software, networking and other technical stuff which modern IT needs to have in order to run day-to-day
  • Operations - the processes and behaviours required to look after the "stuff" (the Infrastructure).

However, I personally prefer the term "IT Production", for a number of reasons. In my view, it ...

  • Is simpler and easier to remember
  • Highlights a logical contrast between "Development" and "Production".
  • Implies a single organisational structure dedicated to a single purpose.
  • Defines a clearly recognisable marketplace for tools and services.
  • Recognises the importance of the "after go-live" part of IT, as a discipline in its own right.

The last point is the most important: whilst IT Development enables a business to gain competitive advantage by using technology, it is the IT Production side which actually ensures that the competitive advantage is realised.

One of the areas we are speaking to Gartner about at the moment is the importance of terminology, and the use of IT Production as the recognisable term for what we work in.

Names do mean something. They confer expectations, and status. And IT Production needs to receive the status which it deserves.

Which, of course, means that we have to start delivering to a higher set of expectations.

Saturday, August 29, 2009

Understanding and Managing People who Manage IT

For some time, we have advocated using the four "MOPS" as a means of identifying how to improve the management of IT Production.

However, although these "MOPS" are necessary in order to improve the management of IT Production, they are not sufficient.

IT Production Management is also significantly a people skill. Technical Managers can often benefit from a scientific approach to understanding and managing people.

The use of personality profiling is not particularly common in the IT industry at the moment. However, it may be gaining ground. It can enable IT Managers to ask questions such as “How can I…

  • Improve the way I communicate with my team and peers?
  • Enhance the motivation of my teams?
  • Identify the strengths and weaknesses of a person/team and maximize their performance?
  • Best manage a specific person/team?

One such technique is the Birkman Report, an internet-based assessment system. It describes your unique style of leadership - your goals, your approach, what motivates you to lead, and what happens to you under stress.

Armed with this information, the IT Manager is able to develop and refine their leadership skills.

Legal Stuff

The MOPS ™ acronym is trademarked by Dennis Adams Associates Limited. The acronym stands for Metrics, Operational Tools, Processes and Standards, the four foundational requirements for IT Production Management.

Birkman Direct ® is a registered trademark of Birkman International, Inc. Copyright © 1989 – 2002, Birkman International, Inc, Houston, Texas. All rights reserved. Only Birkman-certified consultants, or persons working under the direct supervision of such consultants, are authorised to give you information relating to the Birkman Report.

Friday, July 24, 2009

How is the economy impacting IT Consulting?

How bad is it really?

We have all heard a lot of feedback over the last few months about how dire the IT consulting market is. I have to say that, at the moment, things don't appear to be as bad as some people are saying. Of course, there is always the effort of trying to get new prospects to part with their money, but that's part of business!

We certainly haven't been inundated with requests for work, but there does appear to be an appetite among some companies to bring in people. Who knows, maybe the worst is over?

Having studied economics (a long time ago!), I am aware that many of the economic indicators are subject to a "lag". In other words, we only know that we have come out of recession about 6 months after it actually happened. The same occurred when we entered recession, as you may recall. We kept getting economic reports saying that we were already facing a crisis, and that it had been going on for months.

Consequently, I think that a more accurate indicator of the state of the economy is typically the extent of the "feel good factor". Speaking to CTOs and others, I get the impression that they are feeling more positive, and have more budget to spend than previously.

Meanwhile, it's a case of chasing people to close the next business deal...

Thursday, April 23, 2009

Oracle Buys Sun. A Natural Progression, or Unnatural Mistake?

Can the Software Giant make sense of Hardware?

Java and Solaris are the prize.

The announcement that Oracle will be taking over Sun Microsystems has generated a huge amount of reaction in the blogs and within the IT industry generally. There have been questions about what Oracle's strategy is, what the future will be for Solaris on Sparc, where the free MySQL database lives, etc. There have also been some questions in some minds about Larry Ellison's sanity. It has certainly been a bold move. Some claim to have seen it coming - I certainly did not.

My own interpretation is that Oracle are being opportunistic. There was an ailing company - Sun Microsystems - whose heyday in the .COM boom was long gone. They had a huge commitment to R&D without much to show for it. They have some distinctive products (the Sparc chips), some OEM business (storage from Hitachi), and some very interesting free, or nearly free, software and a commitment to the Open Source world (Solaris 10, StarOffice - from which OpenOffice development was forked - and MySQL). They also own the Java stack. Maybe they are worth a few billion, even if they don't currently make a profit.

The Profit Motive

And that is the key point. If Oracle is about anything, it is about exploiting its assets to make a profit. I suspect that there was no "grand strategy". Charles Wang, the former head of Computer Associates, once said that at the level he worked, people "make it up as we go along". Oracle is driven by the profit motive - that, and a hatred of Microsoft. Add to that the fact that IBM and Cisco are circling around Oracle's historical profit levels, and the deal makes sense.

More than just a database company

Oracle has been more than a database product company for many years. They started with the database, but over the last few years have positioned the company as an Application Platform. Oracle Financials was one attempt. Then add PeopleSoft, and myriad other acquisitions. So they have diversified away from the database. If you look at their latest figures, you see that the Oracle database itself is less than 50% of the revenue of the company as a whole. So this is an exercise in diversification. Move up and down the software stack to ensure that you can offer everything the customer could possibly want. All at a profit.

Hardware is not Software, Larry

Oracle has tried to move into the Hardware space before. They created a product called "Raw Iron", an embedded hardware product for running the Oracle database. Coincidentally (maybe?) this was based on Sun hardware. There is a very interesting FAQ released by Oracle yesterday which says "Oracle's ownership of two key Sun software assets, Java and Solaris, is expected to provide our customers with significant benefit." This suggests strongly that Oracle still see Sun as a software vendor. Whilst Oracle have lots of experience in integrating acquired companies, those have always been other software companies. Running a Hardware company is a different thing. The sales model is different, the lead times are different. And you have to ship physical equipment all round the world. It will not be an easy integration.

Predictions

Everyone else is making predictions. Usually, these are based on what the author would do. However, these are my predictions on what Oracle themselves will do. Whether they are accurate predictions, I will leave history to determine. Whether they are good business, that will be the realm of Economics.
Here are my product-by-product predictions:

  • Java: I doubt if Oracle want to upset the Java community. In fact, I suspect that Java will become more open. Oracle's view will be: why do the work ourselves when there are so many willing volunteers to do it for us? Oracle want Java so that they can ensure that all their applications have a good strong application server stack. But watch out for "Oracle Extensions" to the main product.
  • Solaris: Oracle claims that it can now optimize the Oracle database for some of the unique high-end features of Solaris. It has always had this option, but was afraid of "lock-in" to another vendor's product. I predict that there will be some new features of Solaris that Oracle can exploit. But there won't be much. They don't want to alienate the Linux users.
  • Sparc Chips: This is the nub of the question. Oracle have said that they will grow the business. Oracle salesmen may clinch deals by selling integrated hardware alongside the application. They will be able to point to Oracle-specific APIs in Solaris to show performance gains. However, if Fujitsu decide to come calling, I would not put it past Oracle to hand over Sparc development and OEM the solutions.
  • Sun Storage: If Oracle can sell storage at a marginal profit, they will do so. Particularly if it means software license sales.
  • Star Office: Maybe this is a product that can be stacked against Microsoft. But you have to sell a lot of Star Office licenses to equate to a single Oracle DBMS license. Is it worth it? I suspect that, as with Sun, Star Office will be a sideshow.
  • MySQL: Who cares? The fact is that MySQL generates relatively little profit. So R&D will be cut back. The product will still be there, but will rely on the Open Source community to develop it. Oracle knows that MySQL is not much of a threat. It will be allowed to stand, or fall, on its own.
It will be worth watching this one...

Thursday, October 11, 2007

Windows 2008 Server Core - Back to the Future (Command Line)?

Where's my command line manual? Slick move or desperation? The news that Windows 2008 Server will be available in a "cut-down" version appears to be good news from many different aspects, especially for people who will want to run IIS or SQL Servers.

Firstly, the ability to de-install complex logic which is not required for the core work (such as the GUI) will reduce the "attack surface": the number of possible entry points where hackers can gain exploits. The smaller the operating system, the less likely there will be vulnerabilities. Secondly, it must logically simplify the runtime behavior of the OS, and make it easier to maintain and manage. This can only be good news for Sys Admins involved with Windows Web and SQL Servers.

However, there are always concerns which arise. Since the only way to dialog with the Server will be via the new command-line shell, the question arises: "How long will it be before this shell exhibits vulnerabilities?" On the face of it, Microsoft have neatly side-stepped questions about vulnerabilities in Internet Explorer, Explorer, Media Player, etc. by the simple expedient of removing them from the build entirely. If you want a server, just get rid of all the non-server pretty stuff.

At the same time, it has to be acknowledged that Microsoft do appear to have listened to their customers. Most organizations with Windows Server build their own cut-down deployment version, particularly for the "Edge" or DMZ where the web servers live. Microsoft have just reflected this preference, although in this case the amount of "fat" that can be cut out is far more than is currently possible with Windows 2003. In my view, good Sys Admins are always able to do most of their work from a command line. It sounds as if this will become essential in the future.

The circle is complete

Back in the old days of NT 3.5, you may remember that the GUI was de-coupled from the core Operating System. At the time, this was a clever approach by Dave and his designers. It fitted well with the fact that NT was inspired by the VMS 32-bit kernel, and enabled the GUI and core development to follow their own lines. I also heard that NT 3.5 systems could run successfully even if the GUI crashed - and I did see a partial example of this back in the early 90's.

The problem with this approach was that the overhead of context switching in and out of the GUI every time a window had to be moved, or a box drawn, resulted in a slow Operating System on the desktop. And at the time, Microsoft wanted to maintain the one Operating System for both the Desktop and the Server ranges. With the introduction of the Windows 95-style shell, Microsoft did two things. They re-wrote Windows Server with a 95-style shell, and they made the shell itself a key part of the Server, thus destroying the de-coupling which had been done in the original version. The result was a relatively slim, stable and fast Desktop Operating System. It was also a reasonable Server, as anyone still using NT4 will testify.

The problem was that the GUI itself and all its attendant add-ons (Internet Explorer, for one) resulted in a more bloated OS, which became inefficient and vulnerable to attack. And as the Server market began to ramp up, Microsoft began to question the benefits of keeping a single code base for the Server and Desktop. With the introduction of Windows XP (and later Vista) for the Desktop, it became clear that the code had to be forked. It was time to specialize.

So now we are back to the era of slimmed-down, command-line-only servers which carry only the software for their own specific purposes, with any extraneous generic functionality stripped off. Come to think of it, isn't that what SERVERS are really meant to be like?

Friday, November 03, 2006

Now you can buy SuSE Linux from Microsoft: am I dreaming?

The Microsoft - Novell saga continues. Maybe Microsoft will stop selling Operating Systems??? I don't know if you found it a shock announcement, but it certainly confused me. Released on 2nd November was the announcement that Novell and Microsoft have agreed a set of broad business and technical collaboration agreements that will help their customers "realize unprecedented choice and flexibility through improved interoperability and manageability between Windows and Linux". There you have it. There is even a picture of Novell Inc. President and CEO Ronald W. Hovsepian and Microsoft CEO Steve Ballmer shaking hands, all smiles. The agreement basically means that you can ask your Microsoft salesman to quote for X copies of SuSE under a reseller agreement. What is happening?

Best of enemies / best of friends?

There is a saying that you should keep your friends close to you... and your enemies even closer. Well, Microsoft and Novell certainly come into that category. Don't forget that Novell could be credited with creating one of the first practical file/print servers, a workstation authentication mechanism, a small systems directory, a ... until these were trumped by Windows NT, and things like Active Directory. Even today, there are people who prefer Netware to AD and think that we should all be running token ring instead of NetBIOS (or whatever it is now called). Don't forget WordPerfect, one of the really good early word processors (under MS-DOS), until Word began to become the de-facto standard. I think the case has been made: Microsoft and Novell have a history of competition. So what does Microsoft do with competition? They either kill them off, or buy them out. So has Steve Ballmer gone soft?

Some of the small print...

On the technical side, the two companies will set up a facility where engineers will work on enabling co-location of Windows and Linux, using virtualisation technology. In addition, there will be common standards on web services management, interoperability between AD and the Novell Directory, and translators between the MS Office XML file format and the Linux OpenDocument format.

But examine some of the detail, and you begin to see what is happening. Firstly, the decision to sell SuSE via Microsoft is just a concession. I doubt if MS salesmen will be given priority commission rates on Linux sales. And Mr. Ballmer himself has been quoted as saying "If you've got a new application that you want to instance, I'm going to tell you the right answer is Windows, Windows, Windows." Pretty conclusive. So it appears that Novell will get precious little sales revenue from this agreement.

What they do get is time. They have effectively bought off the giant by feeding him some scraps. In return for a percentage of the revenue from every SuSE license they sell, Microsoft has dropped any intellectual property legal actions they may have against Linux users. So Microsoft gets some cash from SuSE sales (whether they contributed to the sale or not, they still get the money), and Novell gets some breathing space. I think Microsoft is the winner here. Novell needs to take advantage of this calm before the storm. They have until 2012 (when the agreement may run out) to build sufficient impetus to be able to stand on their own two feet. Of course, that assumes that the agreement goes full term. Some companies have been known to exit long-term agreements ahead of time...
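The format-translator idea mentioned above can be sketched in a few lines. This is a deliberately tiny illustration using two made-up XML dialects standing in for the real (far richer) Office Open XML and OpenDocument schemas; the function name and both schemas are my own inventions, not anything from the actual agreement.

```python
# Toy translator between two hypothetical document dialects. The real OOXML
# and ODF formats are vastly more complex; this only shows the shape of the
# mapping problem the Microsoft/Novell engineers signed up to solve.
import xml.etree.ElementTree as ET

def msxml_to_odf(doc: str) -> str:
    """Map a toy <document><p>...</p></document> dialect onto a toy
    <office:text><text:p>...</text:p></office:text> dialect."""
    src = ET.fromstring(doc)
    dst = ET.Element("office:text")
    for para in src.findall("p"):
        out = ET.SubElement(dst, "text:p")
        out.text = para.text           # copy the paragraph text across
    return ET.tostring(dst, encoding="unicode")

print(msxml_to_odf("<document><p>Hello</p></document>"))
# -> <office:text><text:p>Hello</text:p></office:text>
```

The hard part in practice is not the happy path shown here but the features one dialect has and the other lacks, which is why such translators are rarely lossless.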

Thursday, September 14, 2006

Does SOA spell the end of SAP?

Is this the end of the SAP Consultant's gravy train? Want a job as a SAP Consultant? SAP Consultants have always had an interesting life. The challenge centres around the fact that you cannot ever have a generic solution to a specific set of customer requirements. It is all very well persuading people in the Infrastructure world that a generic solution like Tivoli or Unicenter or Patrol will address their needs. Crudely speaking, Infrastructure is Infrastructure. But when you get into Enterprise Resource Planning, you are touching key differentiators. Each company's balance sheet and profit and loss accounting is different. Touch that, and you must have a specific solution tailored to each customer.

So what does this have to do with SOA? Everything. In the past, this customization has been done by highly skilled (and highly paid) consultants who would crawl all over the organization and require significant amounts of business analysis before they could work their magic in customizing SAP. As a result, deployments of SAP could sometimes be measured in years and months, rather than the months and days which other applications may have required. I am not suggesting that there was anything wrong with this. It was simply a by-product of having to customize each deployment of SAP to address the specific business needs of the customer. But now things may begin to change.

SOA - the next generation

Re-writing SAP to conform to a Service Oriented Architecture (SOA) could potentially (and I use the word "potentially" deliberately) solve this challenge and reduce the time to deployment of SAP solutions. It might also reduce the complexity and deployment challenges, so that SAP could deliver upgrades and enhancements quicker. The SAP approach employs NetWeaver and Web Services, so that all SAP software will be SOA-enabled by 2007. This should be good news for Systems Integrators and Customers.

In theory, with a common services interface, it should be easier to customize SAP, and/or integrate it with other Office or Enterprise software products. After all, there are many developers these days who know about Web Services. The recent announcement of SAP Discovery will also help smooth this transition. SAP could become part of a so-called "composite application" containing Business and Financial workflows. According to one Research Group, SAP's decision to use a model-based approach will make it easier to tailor the application to fit the business, not vice-versa. In short, SAP's pragmatic approach may reduce time to deployment and thereby increase the attractiveness of SAP to small and medium size businesses. To do this, SAP may well have to look at its pricing model.

SOA Footnote

On a general note, there is also some disillusionment arising about the use of SOA these days. Even David Chappell is quoted as saying that software re-use (one of the key justifications for introducing a SOA) has failed because of cultural and business barriers. So, for many organizations like SAP, the key benefit of SOA-enabling their software may be to open it up to the customers. This may mean that the days of the specialized SAP customization consultant are numbered. On the other hand, I seem to remember them saying a similar thing when COBOL was introduced...
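The "composite application" idea above can be shown with a toy example. Nothing here is SAP's or NetWeaver's actual API; the service names and the uniform request/response interface are illustrative inventions, included only to show why a common services layer makes composition and tailoring cheaper.

```python
# A toy composite application: each "service" is a callable taking and
# returning a plain dict, standing in for a web-service request/response.
# Because every service shares one interface, composing them is trivial.
from typing import Callable

Service = Callable[[dict], dict]

def pricing_service(request: dict) -> dict:       # hypothetical ERP service
    return {"net": request["quantity"] * request["unit_price"]}

def tax_service(request: dict) -> dict:           # hypothetical finance service
    return {"gross": round(request["net"] * 1.175, 2)}  # UK VAT rate of the era

def compose(*services: Service) -> Service:
    """Chain services: each response is merged into the next request."""
    def composite(request: dict) -> dict:
        for service in services:
            request = {**request, **service(request)}
        return request
    return composite

quote = compose(pricing_service, tax_service)
print(quote({"quantity": 10, "unit_price": 3.0}))
# -> {'quantity': 10, 'unit_price': 3.0, 'net': 30.0, 'gross': 35.25}
```

Swapping in a customer-specific pricing or tax rule means replacing one callable, not re-engineering the chain - which is exactly the customisation cost saving claimed for a common services interface.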

Sunday, April 02, 2006

Vista delays could open up Linux on the desktop

Will corporate users finally lose patience? One more delay too much? The news has broken (not unexpectedly) that the latest version of Windows - now to be called "Vista" - has been delayed, at least until the start of 2007. Will this be the delay that finally encourages people to move to Linux or other desktop technologies? If such a revolution were going to happen, now is probably one of the better times for it. After all, Linux is (almost) able to be deployed by a non-technical user, and it is backed up by OpenOffice, Thunderbird, Firefox, Samba and other compatible products that enable it to co-exist in the Microsoft world. Will people jump ship following yet another Microsoft delay?

Reasons for the revolution

It's not just the delays that users are upset about, it's the lack of functional benefits that they will get when / if they deploy Vista. Firstly, what is there in Vista for the average commercial user? The new File System, one of the new features which were touted some time ago, will not make it at all. Basically, Vista is just a new interface, a bit closer to the Apple "glass" interface, with the ability to have semi-transparent "3-D" appearances. Neat, but not exactly revolutionary. And not much, if people are being encouraged to move away from Windows XP. Of course, there is a new programming interface, and the wide use of XML to configure applications and define the interface. Useful for programmers' CVs, but not exactly ground-breaking. In fact, when all is said and done, there is not a lot in Vista which the average user will start writing home about. And that is the problem. What will users get for their money? In short, why should they abandon XP? The only real reason is that, one day, Microsoft will stop supporting it, just as they no longer support Windows NT workstations. So, to coin a phrase from Star Wars, "Fear will keep the local systems in check".

Will it happen?

My view is that the take-up of Vista by Corporates (who, let's face it, are the people who pay Microsoft the big bucks) is likely to be slow. Corporate customers want to see a return on their investment, and a new glossy front-end does not do it for them. There is also the fear of security holes. Just as XP has been locked down by Service Packs and other patches, the last thing that corporates want is another string of security risks with a new architecture. In fact, Microsoft might get a better take-up if they promoted Vista as just a new shell on XP (just as Windows 2000 was promoted as "Built on NT Technology"). In short, I predict a longer future for XP. Corporates will wait and see. If there are security scares, or performance issues, then you could see an exit from Windows on the Desktop. Who knows?

Tuesday, February 28, 2006

Will Intel-inside-Apple become a corporate standard?

It will be software packages and interoperability that carry the corporation. Apple Inside? The decision by Steve Jobs to ditch Power chips for Intel might have made sense if he had taken it a year ago, but somehow I think he may have missed the boat. Intel, on the other hand, have found another outlet for their entry- and medium-level chips, and given a sharp jolt to the anti-Intel camp (which, from what I have seen, appears to be growing daily with the rise of AMD). So why did Apple suddenly decide to change camps? It's puzzling in some respects. There was an argument that the Intel 32-bit architecture with multiple-core chips had a lot more power than Power. Certainly the new Powerbook with Intel Inside has been reported to have better performance. But was the Power so bad? Not really. Swings and roundabouts with any architectural design.

Winners and Losers

Of course, there are downsides. One of these, which has barely been hinted at, is battery life. The Apple notebooks had a deserved reputation for long battery life. I know of one person who claims to regularly get 5 (five!) hours' life from his G4 Powerbook. Not any more... The Intel chips require lots of juice. So battery life will, like as not, be down in the middle of the PC-type notebook range. Can't have everything, I'm afraid.

So what of the future? Some analysts have said that Apple's move to Intel technology is the beginning of a process towards opening the Apple MacOS X operating system to other hardware. How about purchasing a Toshiba or Compaq, and having MacOS X loaded instead of Windows XP? Sounds very tempting. After all, I can run Microsoft Office on MacOS, can still use email, can be authenticated with an LDAP environment, and can share folders using Samba. Sounds promising to me. But will it happen? I don't think so. And the reason is to do with drivers.

One of the things that makes Windows XP so pervasive, but which can also lead to instability, is the fact that it works with pretty much any hardware technology you care to name: Dell, HP, IBM, Toshiba, Lenovo, Tiny, Sony... the list goes on. In order to do this, Microsoft have had to invest in (or persuade vendors to create) device drivers that will work with these technologies. But therein lies the problem. The more drivers, the more complexity, and the more likelihood that they will not easily co-exist. What happens when a NIC from one manufacturer has an IRQ conflict with a 17-inch display driver from another manufacturer? Most times, nothing. But the introduction of signed device drivers in Windows 2000 Server was one indication of the extent of the potential problem, at least in Microsoft's mind.

Apple, on the other hand, does not have these sorts of problems. They have one set of hardware, and that is all the Mac operating system has to work with. Any problems, and Apple make the hardware, the firmware and the software. Co-existence is easy. If Apple introduced new hardware support, they would fall foul of the device driver issues that have bedevilled Microsoft these many years (ever since Compaq cloned the IBM PC BIOS, in fact). I don't think Apple want to go there. Whatever its faults, an Apple is still a single-supplier solution. Incompatibility problems simply don't exist. Long may it continue.

Sunday, December 11, 2005

Network Attached Processing - the next big thing for Java?

Will NAP have the same all-pervading presence that NAS gained? It's not often that a new piece of technology comes along where people are tempted to say "why didn't they think of that before?" Yet I think I have found just such a technology. And if I am right, you will be hearing a lot more about it in 2006. The concept is called Network Attached Processing, and the company in question is Azul Systems.

When Java was first invented, it was decided that software code would not be compiled "natively" to any particular hardware or operating system. Instead, the compiled code (sometimes called "byte code") is designed to run in an environment called a "Java Virtual Machine" (or JVM). The JVM represents an imaginary machine architecture. In order to run Java applications on, say, Linux or Windows, it is necessary to have a Java Runtime environment that maps the JVM to the specific target operating system architecture. Once the Java Runtime environment has been written, all Java programs are able to run on that target. This architectural approach lies at the heart of the "write once, run anywhere" mantra for Java.

Obviously, if the underlying OS and architecture are relatively similar to the JVM, then the Java Runtime will be relatively easy to write, and should be very efficient. Ideally, then, an executing JVM should be located on an Operating System and architecture that is architecturally similar to the JVM. On the other hand, there is a requirement to be able to run Java applications on industry-standard Operating Systems (e.g. to run WebSphere or WebLogic or JBoss on Solaris or Linux).

Azul Systems have provided a solution to this challenge: Network Attached Processing, in the form of an Azul Compute Appliance. The Azul solution consists of a customised JVM or "VM Proxy". This VM Proxy receives incoming byte code for execution and forwards it directly along a network path to the Azul Compute Appliance, which executes it. On the Compute Appliance, the byte code is received by a "VM Engine", which performs the Java compute operation before returning control to the calling server's VM Proxy. One obvious advantage of this architecture is that the "client" Operating System still appears to be performing the same functionality as before, running the business Application. However, the actual CPU processing is being off-loaded to the Azul machine. Neat, or what?

So what's the Azul Compute Appliance? In a nutshell, it is a 24-way (or up to 384-way!) multi-processing server which is specifically designed to run JVM environments very, very efficiently. For example, it handles multi-threading very well (not surprising, if you could have 384 CPUs!), offers oodles of "heap" memory, highly efficient memory garbage collection, etc.

So how about the best of both worlds? Keep your existing application on your old server, hook in a Gigabit Ethernet card, and hang the Azul System off the other end. Better still, have multiple Servers being Compute-Served by a single Azul machine. Sounds a bit like a Compute equivalent of NAS? Yep - you've got it!

Once the concept is grasped, all sorts of opportunities arise. Firstly, we are used to purchasing servers to host J2EE environments based on their computing power. Instead, the host server becomes just a "mount point", a suitable O/S architecture for running the I/O and communications activity. The real processing is done in the "compute farm". What happens if you need additional CPU for a growing application? No action required - just ensure that the compute farm is powerful enough! The use of a "Compute Farm" suddenly changes the whole dynamics of Servers in the datacentre. Each Java server could be just a tiny blade (or a Virtualised server), providing it has the O/S and I/O capability for the application. Datacentre management of Servers would be massively simplified with NAP, just as it has been in the storage arena with NAS.
Azul Systems' web site is at http://www.azulsystems.com. I hear that they plan to support the .NET runtime as well in the near future. Definitely one to watch in 2006.

Tuesday, November 29, 2005

News Review: Ingres Open Source Buyout from CA

Yet another twist in the Ingres saga, as CA floats it off independently. But have CA forgotten what they use Ingres for?

It seems only yesterday that I was commenting on the decision by CA to Open Source the Ingres relational database (see Ingres Open Source = Graveyard or Second Life), in July 2004. At the time, I considered that there was significant business logic in the decision. A genuine heavyweight Open Source DBMS would be able to compete with the likes of MySQL, with the ability to handle massive volumes of data. On the business side, revenue would come from commercial support, even though the software itself could be free. Now the world has changed again with the management buyout of Ingres Corp from CA.

The new Ingres Corp

I suspect that the writing was on the wall for Ingres as a result of the reorganisation last April, when Ingres became part of the "others" division. Now, however, CA have sold the product and many of the staff to a Silicon Valley private equity firm, which has itself only been going for a year. One of the most impressive things about this transaction is the management team, which reads like a "who's who" of database heavyweights:

* Dave Dargo (CTO): 15 years at Oracle, including responsibility for the Oracle on (Open Source) Linux programme.
* Emma McGrattan (Engineering): 11 years running the CA Ingres engineering team.
* Andy Allbritten (Support Services): ex-Oracle VP for Support and Services.
* Dev Mukherjee (Marketing): ex-Microsoft Servers General Manager for Marketing.

Ingres Corp is now a company of 100 employees (most of whom are ex-CA staff who have moved across to the new organisation), with plans to grow significantly. In addition, the OpenRoad development tools (which are tightly coupled with Ingres itself) and other products like the Enterprise gateway are also included. Even the follow-the-sun support teams are moving. However, OpenRoad will not be Open Sourced yet (try saying that in a hurry!).
We are promised that aggressive marketing will start soon. Coincidentally (?), the announcement was made on the same day that Microsoft released the latest version of SQL Server. That is called timing!

Reactions

Looking at the Ingres newsgroups, there is a mixed reaction. One DBA complained that his company was right in the middle of negotiating a move from the earlier closed-source versions of Ingres to the new Open Source Ingres R3 when the announcement was made. Others responded positively, since many had complained in the past that CA had not really marketed Ingres, or put R&D into it. With the new company there will have to be a strong focus, and existing customers should still be supported, often by the same people as before.

CA's decision

Whilst this appears, in my opinion, to be very good news for Ingres, I am slightly puzzled as to why CA have agreed to it. Don't they realise that a huge percentage of their software is deeply reliant on the Ingres database? Unicenter R11 is due to be launched, and it uses Ingres as its only relational database. Won't the marketing people at CA feel a bit exposed at seeing their products dependent upon a non-CA database?

Challenges

In the past, one of the significant aspects of the Ingres sales proposition was that it was being actively used and promoted within CA itself. Therefore, anyone who bought Ingres would be purchasing a product with a significant locked-in installed base, which would guarantee longevity. Now things have changed: the "used by CA" tag no longer has the same credibility. Ingres Corp is out on its own in the wide world, and will have to fight against the likes of MySQL. The competition will be fierce. Oracle have just announced a free (yes - free!) version of Oracle Express for Linux. SQL Server has just seen a major enhancement. And MySQL is beginning to move into the big league, with an improved optimizer, views and triggers.
One big question for all "Ingressers" (if indeed that is the correct collective name) is whether the Ingres development programmers and support team will remain with the new company in the medium term. One thing is clear to me: Ingres Corp is unlikely to have the same marketing muscle as Computer Associates. On the other hand, it should have a more focused approach to selling the product. I wish them well.

Saturday, October 29, 2005

News Review: Peregrine finds a home inside HP

The Prodigal Returns

News that HP have agreed to purchase Peregrine must put smiles on the faces of the existing 3,500 ServiceCenter customers. At last they can feel that the software house has a proper home, where it will hopefully get the investment and marketing effort it deserves.

Peregrine has had an interesting story getting here. It was quite an acquisition-maker itself in the early part of the decade, including an interesting time "dating" Remedy (now part of BMC). Then it ended up filing under Chapter 11, and seemed to write itself out of the history books. The thing that seems to have saved it is that it has a reasonable product, at a time when every company is trying to adopt the ITIL framework by producing software offerings with the ITSM (or Service Management) strapline. HP's offering in the form of OpenView very much complements this, so I foresee a strong future for both products.

The key factor for ITSM offerings is having a common configuration management database (CMDB). This database should be able to tie together all the assets of the company (servers, workstations, software licences and installed applications), and cross-match them to the HelpDesk (so that incidents can be logged against them). This in turn means that Problem Management can drill down into root causes by looking at the incident history, and that Changes and Releases can be implemented against these assets. So a common CMDB is vital, both for a good ITSM offering and for a successful deployment of ITIL processes. Peregrine has the potential to be such a product, particularly if it is well integrated with OpenView in future releases.

One curious question: why didn't IBM purchase Peregrine? After all, IBM acts as the channel for a lot of the Peregrine products, and surely IBM would benefit from Peregrine's CMDB. These days, I don't hear much about IBM Tivoli. It used to be the market leader in management of mid-range systems and applications. Now we have BMC, Computer Associates and HP. Are IBM unconcerned about the ITSM market? Or maybe they are just biding their time.
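To make the common-CMDB idea concrete, here is a deliberately tiny sketch of my own (the class and field names are invented, and bear no relation to Peregrine's actual schema): configuration items are the hub, incidents are logged against them, and Problem Management drills into an asset's incident history.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Illustrative mini-CMDB: every incident must reference a known
// configuration item (CI), which is what makes root-cause
// drill-down by asset possible.
public class MiniCmdb {
    record ConfigItem(String id, String type) {}          // e.g. server, licence
    record Incident(String id, String ciId, String summary) {}

    private final Map<String, ConfigItem> items = new HashMap<>();
    private final List<Incident> incidents = new ArrayList<>();

    void addItem(ConfigItem ci) {
        items.put(ci.id(), ci);
    }

    void logIncident(Incident in) {
        // Cross-matching to the HelpDesk: reject incidents
        // against assets the CMDB does not know about.
        if (!items.containsKey(in.ciId()))
            throw new IllegalArgumentException("Unknown CI: " + in.ciId());
        incidents.add(in);
    }

    // Problem Management's view: all incidents recorded against one asset.
    List<Incident> historyFor(String ciId) {
        return incidents.stream().filter(i -> i.ciId().equals(ciId)).toList();
    }
}
```

The point of the sketch is the single shared key (the CI identifier): once HelpDesk, Change and Release records all hang off the same asset register, the cross-matching described above falls out naturally.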

Thursday, September 29, 2005

News Review: Oracle + Peoplesoft + Siebel

If you can't make it, buy it.

The recent announcement that Oracle will be buying Siebel had been fairly widely predicted in some areas of the press. Oracle are paying $5.8 billion for 4,000 customers. I guess this is relatively small money compared with the $10 billion they paid for PeopleSoft. However, it leaves Oracle with a massive workload to integrate, and to get value from, all their many products and offerings.

The Support Challenge

If you look through Oracle's acquisitions during the last few years, there is a huge amount of CRM software in their portfolio:

* PeopleSoft
* Vantive (part of PeopleSoft)
* JD Edwards
* plus Oracle's own offering
* and now Siebel

How on earth can Oracle support that many different code lines for just one functional requirement? Some clues about how Oracle might wish to do this appear in a recent article in Oracle Scene, the UK Oracle User Group journal, about Project Fusion. Fusion is the name for Oracle's new Service-Oriented Architecture (SOA), a way of building software applications that promotes connectivity between them. Oracle will need it! The article explained how this approach would bring together the best of Oracle, PeopleSoft and JD Edwards. Perhaps SOA-enabling all these tools will work, but there is still a lot of (redundant?) code to support.

Motivation

So why did Oracle buy Siebel? Basically, there are a number of reasons for buying a rival manufacturer in the industry:

* to gain technology, to update or improve your own offering;
* to take out a competitor, enabling you to charge monopoly prices;
* as defensive action, to consolidate against another, larger rival.

Somehow, I don't think Oracle have bought Siebel in order to gain some valuable piece of technology. Instead, I think the second and third reasons are more likely. In the case of the third reason, the big rival is, of course, SAP.
However, Oracle may have a large percentage of the market share of CRM/ERP software, but their offering is fragmented, unfocused, and expensive to support. There is a battle coming between Oracle and SAP, and any general will tell you that the organization which is able to mass all its forces against a single point of attack will win. Oracle is in desperate need of a good coordinating strategy.
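To show what "SOA-enabling" those overlapping code lines might mean in practice, here is a simplified sketch of my own (the service contract, adapter names and routing rule are all invented, and bear no relation to Project Fusion's actual interfaces): each acquired CRM product sits behind one common contract, so callers never know which code line answered.

```java
import java.util.Map;

// Hypothetical SOA sketch: one service contract, many product adapters.
public class CrmSoaSketch {

    // The shared contract that all callers program against.
    interface CrmService {
        String lookupCustomer(String id);
    }

    // Adapter wrapping the (imaginary) Siebel code line.
    static class SiebelAdapter implements CrmService {
        public String lookupCustomer(String id) { return "siebel:" + id; }
    }

    // Adapter wrapping the (imaginary) PeopleSoft code line.
    static class PeopleSoftAdapter implements CrmService {
        public String lookupCustomer(String id) { return "peoplesoft:" + id; }
    }

    // Routing layer: pick whichever back end owns the data for a region,
    // falling back to a default. Callers see only CrmService.
    static CrmService route(Map<String, CrmService> registry, String region) {
        return registry.getOrDefault(region, new SiebelAdapter());
    }
}
```

The (redundant?) code lines still exist behind the adapters, of course, which is exactly the support burden discussed above; what SOA buys is the freedom to retire them one at a time without breaking the callers.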