Laszlo lands $8 million to expand applications
Laszlo Systems on Wednesday is expected to announce that it has raised a series C round of $8 million, led by WI Harper. Altogether, it has raised over $26 million. The San Mateo, Calif.-based company, which makes tools for building interactive Web applications, said it intends to use the funding to invest in marketing, expand its partnership program, and develop Web applications.
Congrats!
Tuesday, September 05, 2006
Laszlo gets some
Some good news in Laszlo-land from CNET.
Sunday, September 03, 2006
Testing before updating corporate client software
How many more times will we read one of these "virus definition update breaks things" stories and just tell ourselves, "Phew, it wasn't [our brand of AV]"?
I'm increasingly wary of simply trusting vendor QA to ensure that automated AV definition updates won't cause us problems. And as more components of the system (firewall, antispyware, who knows what's next) adopt similar auto-updating, that problem will only get worse.
(The same issue applies with system, application, and middleware patches and updates.)
Now, what we should NOT do is stop updating virus definitions (or stop patching & updating systems, applications, and middleware). That's foolhardy: remaining unpatched against known security exploits, being unable to detect the latest viruses, or living with bugs that have already been fixed is a known Very Bad Thing.
What we need to do is make it safe to update definitions (and patch, and update) frequently.
At a minimum, with every definition set we should install it on a baseline machine, run a full scan, and verify that nothing was flagged as a virus. The definitions do not go out until that test passes. (This rule may be bypassed if there is an active SIRT event that the updated defs would mitigate.)
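Here's a minimal sketch of what that gate could look like, assuming a hypothetical scanner command line ("scancli"), a made-up staging share for new definition sets, and an illustrative report format; none of this reflects any particular vendor's real interface:

import subprocess
import sys

DEFS_STAGING = r"\\staging\av-defs\latest"   # hypothetical staging location for new defs
BASELINE_DIRS = [r"C:\Windows", r"C:\Program Files", r"C:\BaselineApps"]

def install_definitions(path: str) -> None:
    # Hypothetical vendor command to load a staged definition set.
    subprocess.run(["scancli", "--load-defs", path], check=True)

def full_scan(paths: list[str]) -> int:
    # Hypothetical full scan; assume the CLI prints one line per detection.
    result = subprocess.run(
        ["scancli", "--scan", *paths],
        capture_output=True, text=True,
    )
    detections = [line for line in result.stdout.splitlines() if line.strip()]
    return len(detections)

def main() -> int:
    install_definitions(DEFS_STAGING)
    hits = full_scan(BASELINE_DIRS)
    if hits:
        print(f"GATE FAILED: {hits} detection(s) on a known-clean baseline; "
              "holding this definition set back from production.")
        return 1
    print("Gate passed: no false positives on the baseline; OK to release.")
    return 0

if __name__ == "__main__":
    sys.exit(main())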
It should be achievable to have a test suite of basic functionality for the desktop image. We already have many of the necessary pieces throughout the broader ECC group, and elsewhere in IS. Assembling them into a manual test plan (v1) is quite achievable. Automating parts of that plan (v2) should also be achievable - especially if we assign the job of executing the manual plan after every environment change to someone with scripting skills (or teach someone who already knows how to code how to use a scripting tool); you can bet they'll be scripting away by the third run-through. Complete automation seems improbable for v2, though; focus on automating the steps that are most annoying to execute manually, and pick up any easy time-saving automation along the way.
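As a rough illustration of the kind of check that's easy to script, here is a sketch of a few smoke tests for a desktop image. The host name, share path, and application path are placeholders, not our actual environment:

import socket
import subprocess
from pathlib import Path

def check_dns(host: str = "intranet.example.local") -> bool:
    # The manual plan's "can you resolve/browse the intranet?" step (placeholder host).
    try:
        socket.gethostbyname(host)
        return True
    except socket.gaierror:
        return False

def check_file_share(path: str = r"\\fileserver\public") -> bool:
    # "Can you open the shared drive?" step (placeholder share).
    return Path(path).exists()

def check_app_launches(exe: str = r"C:\Program Files\LOB\app.exe") -> bool:
    # Launch a line-of-business app (placeholder path) and confirm it stays up briefly.
    try:
        proc = subprocess.Popen([exe])
    except OSError:
        return False
    try:
        proc.wait(timeout=10)        # exiting within 10 seconds counts as a failure
        return False
    except subprocess.TimeoutExpired:
        proc.terminate()
        return True

if __name__ == "__main__":
    results = {
        "dns": check_dns(),
        "file share": check_file_share(),
        "LOB app launch": check_app_launches(),
    }
    for name, ok in results.items():
        print(f"{name}: {'PASS' if ok else 'FAIL'}")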
In a utopian world, we would install the defs onto a fleet of VMs, each loaded with the applications whose business criticality justifies packaging them and building an automated test script, run the VMs through those scripts, and again stop the updates from going out to production if we get failures (a rough sketch of that flow is below).
(That utopian capability would of course be used EVERY time we had a baseline change, not just for virus definitions.)
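Sketched very loosely, and assuming a hypothetical "vmctl" CLI, snapshot names, and per-package test script standing in for whatever virtualization and packaging tooling we'd actually use, that gate might look like this:

import subprocess
import sys

TEST_VMS = ["win-base", "win-finance", "win-engineering"]   # one VM per packaged-app bundle
SNAPSHOT = "clean-baseline"
TEST_SCRIPT = r"C:\tests\run_smoke_tests.cmd"               # installed with each app package

def run(args: list[str]) -> bool:
    # Treat a zero exit code from the (hypothetical) tool as success.
    return subprocess.run(args).returncode == 0

def test_vm(vm: str) -> bool:
    # Revert to a clean snapshot, push the staged defs, run the packaged-app test suite.
    steps = [
        ["vmctl", "revert", vm, SNAPSHOT],
        ["vmctl", "start", vm],
        ["vmctl", "exec", vm, "scancli", "--load-defs", r"\\staging\av-defs\latest"],
        ["vmctl", "exec", vm, TEST_SCRIPT],
    ]
    return all(run(step) for step in steps)

def main() -> int:
    failures = [vm for vm in TEST_VMS if not test_vm(vm)]
    if failures:
        print("Holding the update: failures on " + ", ".join(failures))
        return 1
    print("All VM suites passed; definitions can go out to production.")
    return 0

if __name__ == "__main__":
    sys.exit(main())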