Monday, October 22, 2012

Italy does it again

http://www.earthmagazine.org/article/hazardous-living-geologists-be-charged-not-predicting-earthquake
How is this possible?

Sunday, April 11, 2010

Fire chief: Crew behind the fire on Scandinavian Star

Fire chief: Crew behind the fire on Scandinavian Star: "The first firefighter to arrive at the disaster ship Scandinavian Star believes the crew must have set the fire and then sabotaged the firefighting effort."

The crew should be given full immunity today, this long after the fact, and then everyone who may have had knowledge of the plans should be pursued. If Lene E can take a beating for skipping a planned meeting, what should people who knew about planned arson get?

Thursday, March 11, 2010

What could you do with a sub-20k USD x86/x64 server with 48 cores at 2.2 GHz

Using NetBSD, we match the performance of one of Cisco's new routers built around their new Cisco QuantumFlow Processor - well, except for the power consumption. Though our solution is non-green, it will take more than a couple of years before the additional x86 power cost catches up with the Cisco equipment+IOS cost.

At a higher layer, you finally have the message-hub HW platform you never knew you needed when you first introduced your CIO to the ESB :-)

You realize that, given enough spindles, this 48-core machine actually makes the NCR/Teradata warehouse scale linearly when you query multi-TB data.

Someone trademarks the CloudInABox term, and Intel sues them and wins, because it is too close to their future concept of CloudOnAChip.

You don't have to buy additional HW to handle 20,000+ concurrent Exchange users - but you will get ripped off by VMware and Microsoft.

Someone tries to upgrade their maxed-out 4xQuad 3.33 GHz OLTP database server to this 48-core 2.2 GHz cloud-box - and they fail - because the query queues fill up.

Thursday, February 25, 2010

Next stop - go to jail for a non-blocked free service

Read this http://googleblog.blogspot.com/2010/02/serious-threat-to-web-in-italy.html, and imagine who else should be or should have been prosecuted.

Quotes
- "In late 2006, students at a school in Turin, Italy filmed and then uploaded a video to Google Video that showed them bullying an autistic schoolmate"
- "Nevertheless, a judge in Milan today convicted 3 of the 4 defendants — David Drummond, Peter Fleischer and George Reyes — for failure to comply with the Italian privacy code."

Who else contributed to this specific privacy code violation?

Producers of the video equipment:
- the recorder company employees
- the tape/media company employees
- broadcasters - by which I mean all the internet providers transferring the IP packets with the video - ISP employees
- YouTube employees
- employees of all the viewer-software and viewer-machine producers
- human viewers
- anyone viewing pictures from the video
- owners of video copies in any form on harddisks/flash/backup media (not necessarily having seen the video)

Guess how it adds up if indirect privacy code violators could be made responsible:
- tape/media-plastics producers
- electricity producers
- the video and IP-packet standards, and those who defined them
- reporters covering the news
- me - for commenting on it

What can you learn from this?

- You really have to watch out for who you work for and what services and products your employer is delivering in Italy on your initiative
- YouTube will have to close down its Italian services to protect its employees
- All video service providers in Italy are gambling with their employees' freedom
- Destroy all your Italian holiday videos to avoid the risk of going to jail
- Don't record any future video in Italy

Sunday, March 11, 2007

Working at the Office and at Home

Currently my employer pays for my scheduled work - also the part that I choose to do at home. Unfortunately my employer sponsors neither my home PC nor the software necessary to do the work - specifically, to edit Office files at home. Still, I sometimes work on them at home - but I do not feel obligated to do so - especially since the impediments are quite significant.

Luckily we use Microsoft software at the Office, and my home PC's free SUN software supports editing just about any file I bring home. I am not lucky enough to have an employer that will pay for a personal laptop for me, nor for software for my private PC.

Well, bringing home an Office file for editing seems easy, but actually it is not. Whether I choose to use email attachments or my private USB memory pen makes little difference. Neither allows for any impulsive editing, since both media require significant planning.

Before leaving work - each day - I need to evaluate which files I can work on. Then I need to copy those files, to be able to work on them if my private time has an opening and I feel motivated to do so.

Time and time again, I faithfully take a copy of some files with me. At home the plans change, and I never get around to editing the copied files. They mostly end up as wasted copies.

Sometimes the private time-plan changes and brings an opening where I feel motivated to edit a file. Too often it requires a file that I did not bring home. Maybe I have it somewhere - in an older baseline. Usually it is not the latest edition, or I cannot be sure it is the latest edition.

The above-mentioned limitations, the waste and the subsequent frustrations demotivate my pursuit of integrated work and private time. I sure hope this is aligned with my employer's strategy.

There are many options for solving a possible integrated-file-editing strategy misalignment: Groove, Collanos, a VPN file share, SharePoint or KnowledgeTree, just to name a few.
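For illustration only, here is a minimal sketch of the kind of one-way sync a mounted VPN file share would enable - pulling the Office files that changed since the last run down to a local folder for offline editing. The share path, folder names and file extensions below are assumptions made up for the example, not a description of any of the products above.

#!/usr/bin/env python3
"""One-way sync sketch: pull changed Office files from a mounted work share
to a local folder for offline editing. Paths and extensions are illustrative only."""
import shutil
from pathlib import Path

WORK_SHARE = Path("/mnt/work-share/projects")  # hypothetical VPN-mounted share
LOCAL_COPY = Path.home() / "work-offline"      # local editing area
EXTENSIONS = {".doc", ".xls", ".ppt", ".docx", ".xlsx", ".pptx"}

def sync() -> None:
    if not WORK_SHARE.is_dir():
        raise SystemExit(f"share not mounted: {WORK_SHARE}")
    for src in WORK_SHARE.rglob("*"):
        if src.is_file() and src.suffix.lower() in EXTENSIONS:
            dst = LOCAL_COPY / src.relative_to(WORK_SHARE)
            # Copy only if the file is missing locally or the share holds a newer version.
            if not dst.exists() or src.stat().st_mtime > dst.stat().st_mtime:
                dst.parent.mkdir(parents=True, exist_ok=True)
                shutil.copy2(src, dst)
                print(f"updated {dst}")

if __name__ == "__main__":
    sync()

A sketch like this only removes the copying chore; it does not solve the "which edition is the latest" problem the way a shared workspace such as SharePoint or Groove would.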

Using larger free USB memory pens is not an option in the long run, because it will always require significant manual labor - taking the pen home with you every day.

Sunday, February 25, 2007

To SOA or Not To SOA

While we are used to developing systems of software, we are far from confident with software solutions consisting of multiple systems.

This is probably one of the reasons why large enterprises were the first to embrace SOA - they have lots of systems to integrate. This is probably also the reason why conventional system development has been slow at embracing SOA - it does not have the integration need - so some COTS system producers are even reluctant to provide SOA-ready interfaces to their stand-alone systems. What they do not realise is that they need a SOA-ready interface just as much as they need their own GUI; otherwise large enterprises will not buy their otherwise great stand-alone system.

Why was SOA invented? As enterprises grew and information systems were added for each business process to control, their systems portfolio also grew. Then enterprises wanted integrated applications, or at least integrated data in their stand-alone systems, so they started doing point-to-point integration based on some narrowly focused need for specific data (or, even worse, they duplicated data in different systems). Sponsors were often owners of individual business processes with no current need nor budget to integrate all the systems.

The result: within a couple of years, distinct departments had each sponsored a point-to-point integration solution. These unplanned point-to-point integrations turned out to be tremendously expensive to maintain, especially as the systems continued to evolve, be upgraded, and be replaced. Somehow all the point-to-point data integrations just remained, and the enterprises had no sponsor ready to replace all of their expensive point-to-point data integrations - and what could they replace them with? There were no COTS products ready to help replace their data integrations. Okay, some COTS products did exist a couple of years ago, but they were often unknown. Did you know Ascential Software or DataPower - before IBM bought them?

Finally the enterprises' point-to-point data integrations became a transitive closure of system connections. Here is an example with just 7 distinct systems, having (n-1)+(n-2)+...+1 = n*(n-1)/2 connections, which for n = 7 gives 21 two-way connections. This data-integration example could include 42 distinct system interfaces, where each system interface could have its own data-transformation component:
Then the enterprises' data-integration solution started to cost more than the individual systems; finally management could see the business case for an improved data-integration architecture. Luckily the COTS products now had the proper brand and a sales force to sell them. The international business machine behind the proper brand eyed a large opportunity (or threat, if you will) in their share of existing enterprise data-integration budgets.
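To make that growth concrete, here is a tiny sketch - Python, purely for illustration - of the formula above: with full point-to-point integration, the connection and interface counts grow roughly with the square of the number of systems.

def point_to_point_counts(n: int) -> tuple[int, int]:
    """Return (two-way connections, distinct system-interfaces) for n systems
    that are all integrated point-to-point."""
    connections = n * (n - 1) // 2   # (n-1) + (n-2) + ... + 1
    interfaces = 2 * connections     # one interface in each end of every connection
    return connections, interfaces

for n in (7, 10, 20):
    connections, interfaces = point_to_point_counts(n)
    print(f"{n} systems: {connections} two-way connections, {interfaces} system-interfaces")

# 7 systems: 21 two-way connections, 42 system-interfaces
# 10 systems: 45 two-way connections, 90 system-interfaces
# 20 systems: 190 two-way connections, 380 system-interfaces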

If enterprises are lucky, the above example will look like this in a couple of years:
This could include one new domain model of the business, 7 new data-transformation components, and one expensive Enterprise Service Bus (ESB). This new integrated data-exchange solution will save the enterprises a great deal of money, and as an added bonus let them integrate new systems in no time. Some also believe this will allow enterprises to rethink their business, because the ESB allows them to build and deploy new business processes - in no time. Well, let us see the ESB's capabilities before we give them too much credit.
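As a rough comparison of the two topologies - again just a sketch, not a claim about any particular ESB product - a hub-and-spoke ESB needs one transformation component per system against the shared domain model, so the component count grows linearly instead of quadratically:

def integration_components(n: int) -> dict[str, int]:
    """Compare point-to-point and hub-and-spoke (ESB) component counts for n systems."""
    return {
        "point_to_point_interfaces": n * (n - 1),  # two per two-way connection
        "esb_transformation_components": n,        # one adapter per system, mapped to the domain model
    }

print(integration_components(7))
# {'point_to_point_interfaces': 42, 'esb_transformation_components': 7}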

Thursday, February 1, 2007

On sale: A Lean and Agile Black Belt Level 5 Scrum - with a Crystal Clear touch of eXtreme

I was unable to re-find a fine article by a highly accredited software process professional. The article I am seeking is the one about reinventing the process model over and over again.

Anyway, once you have a broken process, you will never be fully comfortable performing it. Many people ignore the hiccups and the annoyances; however, someone might try to improve on the process. This is fine. Now the trouble starts. The one who actually succeeds in improving the process thinks the new super process is globally applicable, writes it down in a book, sells some books and gives talks about this new super process.

Until now, no real harm has been done, except for the $$$ and hours your company and you have spent on the books and talks. The real trouble starts when you and your company do not understand the context of the new super process, and you start to apply it in the context of your own company.

The trouble is that a model is not one-size-fits-all. A model is an ideal and filtered image of some past reality, hopefully based on some empirical observations. Thus a model is always some kind of interpretation of the real world (whatever that is).

Once you start to run your own copy of the new super process model, it might act up, especially if it is a complex model. And surely a few feedback loops do not help, and neither does any kind of external noise - periodic or not. This might be why science prefers simple models over complex ones.

So what you realise is that your copy of the new super process does not perform better than your own old process. But once in, the new "super" process cannot be uninstalled. You can try to erase it, distort it, or even try to forget it, but it will never be the same as before.

Then you live with it, learn to love it, and new generations will learn it and pass it on. Until someone cannot stand the hiccups and the annoyances, and finally decides to try to improve the process - with success. Then they write a book and give some talks, and you know the rest.

What makes this story fun is the fact that this kind of knowledge transfer is highly regarded in our western society (if you pay for the IP copyrights, that is). You pay a lot of money for your copy of the new super process. And everyone knows that it is hard to make such a transfer succeed, so you choose the most expensive consultants to help you introduce their copy of the model of the new super process. And of course you end up with a - most of all - expensive process copy.

My wisdom: Learn from others' mistakes. Make your own process improvements (how to succeed at this is a completely different topic - a BI topic, or business intelligence topic).