I am on my way back from the best UKOUG conference I have ever attended, unfortunately a lot earlier than planned. Before I start forgetting all these great moments, it is time to write them up. To borrow James Morle’s words: if you weren’t there, you lose. I couldn’t agree more!
The Oak Table Network organised “Oak Table Sunday”, a hugely successful event on Sunday afternoon. It featured some of the brightest Oracle minds, and the very relaxed atmosphere made it a truly exceptional experience. I have to say that the audience was quite illustrious too: I didn’t recognise Paul Vallee from Pythian with his Movember moustache at first, and to my great joy I finally met Piet de Visser again. After exchanging a few words with them I ran into so many people it was just great!
Unfortunately I couldn’t make it there before the HA panel session, where Alex Gorbatchev, Dave Ensor, James Morle, Greg Rahn, Dan Norris, Graham Wood, Jonathan Lewis, Mogens Norgaard and all the others I just forgot to mention answered questions from the audience.
Inevitably, the Exadata discussion started. One member of the audience put forward the proposition that there should be an Exadata simulator. While everyone in the audience agreed, feedback from the panel was different, and I tend to side with the panel on this one. Sure, Oracle is very secretive about Exadata, but I expect there is a business reason behind it. As everyone who implements Exadata knows, Oracle wants to use ACS to provide support. Some customers, however, have decided not to treat Exadata as a black box and want to understand and ultimately support it themselves. Other consultancies such as Enkitec and VX Company have developed a business model around providing independent support for Exadata implementations.
However, all of the involved parties (including Oracle!) struggle to find good, experienced, reliable and dedicated staff for Exadata support. Now why could that be? First of all, Exadata is a lot less accessible. Unlike other Oracle software, it is not possible to experiment with Exadata without actually getting one. And renting lab time is probably a difficult thing to do as well. As Jonathan Lewis put it, you sometimes have a base plan of what you want to measure, then quickly develop different approaches to the same problem, each of which requires setup time. If the lab time is, let’s say, 48 hours, then spending a couple of hours on each test case takes valuable time away. Needless to say, lab time is in high demand, and you might have to wait a long time to get back in.
Now the counter-argument is of course that you couldn’t do this with a massive p-Series server either, so the problem of testing your application on new hardware has always been the same. Except that Exadata adds a secret sauce to the processing which you cannot get anywhere else ;)
So far I fully agree. There is no way you can try Exadata features without getting one, and if the initial assessment was wrong, your benefit from migrating to Exadata might be less than what you were hoping for. I do, however, disagree with the idea that a simulator makes that situation any better. You won’t be able to use a VM to find out whether there is a performance benefit in migrating to Exadata. First of all, your laptop, or whichever other hardware you are using, will not have the power to simulate a cell with its two hex-core X5670 CPUs and 24 GB of RAM. Neither will you have the F20 card used as flash cache in the real cell. Also, even at the low end of the Exadata product range, the system comes with three cell servers…
Having said that, I do think it would be very nice if cell patching could be practised away from the really precious hardware, for which getting downtime is probably very difficult. Every RAC DBA has applied a bundle patch in the past, or a PSU for that matter. That’s easy, and no different on Exadata. A cell patch, however, is a completely different thing, especially with the added twist that a completely broken cell can have a big impact on availability and processing power. So please, Oracle, do something and let us test cell patches in software!