By doing a phased deployment, just as Crowdstrike should have done in the first place. This is of course just one of several things that should have been done and would have caught the problem without locking up millions of machines that people depend on 24/7.
I'm not an IT person; my career has been in embedded programming, but so much of this is just common sense when you think about all the things that could go wrong.
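Even a back-of-the-envelope staged rollout would have flagged it before it hit everyone. Purely as an illustration (this is not CrowdStrike's tooling; every name here is invented), the gating logic can be as simple as hashing each hostname into a bucket and only shipping the update to hosts that fall inside the current rollout percentage:

#include <stdio.h>
#include <stdint.h>

/* FNV-1a hash: spreads hostnames evenly across buckets 0..99. */
static uint32_t fnv1a(const char *s)
{
    uint32_t h = 2166136261u;
    while (*s) {
        h ^= (uint8_t)*s++;
        h *= 16777619u;
    }
    return h;
}

/* A host receives the new content only if its bucket falls inside the
 * current rollout percentage (e.g. 1% canary, then 10%, 50%, 100%). */
static int in_rollout(const char *hostname, unsigned rollout_pct)
{
    return (fnv1a(hostname) % 100u) < rollout_pct;
}

int main(void)
{
    /* Hypothetical fleet names and a hypothetical 10% canary stage. */
    const char *fleet[] = { "er-workstation-01", "checkin-kiosk-17", "pos-terminal-09" };
    unsigned stage_pct = 10;

    for (unsigned i = 0; i < sizeof fleet / sizeof fleet[0]; i++)
        printf("%-18s -> %s\n", fleet[i],
               in_rollout(fleet[i], stage_pct) ? "gets the update now" : "waits for the next stage");
    return 0;
}

If the first small stage starts blue-screening, you stop right there instead of taking out the whole fleet.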
My question was not about the phased deployment (or better testing)... but how could Microsoft have prevented it? It's not their job to validate everything... theirs is not the Apple model.
Embedded, real time... the only kind.
I wrote a task scheduler once... delivered the code as a relocatable library and provided an API:
void*** func(void***)
Just for kicks, because in reality I was returning a pointer to a handle and wanted to really confuse the moron who thought he knew everything. I got it implemented because the people using it really knew C.
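For anyone wondering what the sane version of that interface looks like, it's the standard opaque-handle pattern, roughly this (names invented for the example; this isn't the code I shipped):

/* sched.h -- the public face of the library (illustrative names only) */
#include <stdlib.h>

typedef struct sched sched_t;               /* opaque handle: callers never see the fields */

sched_t *sched_create(unsigned max_tasks);  /* returns NULL on failure */
void     sched_destroy(sched_t *s);

/* sched.c -- the library keeps the real definition to itself */
struct sched {
    unsigned max_tasks;
    unsigned num_tasks;
    /* ... task table, tick counter, etc. ... */
};

sched_t *sched_create(unsigned max_tasks)
{
    sched_t *s = calloc(1, sizeof *s);
    if (s)
        s->max_tasks = max_tasks;
    return s;
}

void sched_destroy(sched_t *s)
{
    free(s);
}

Callers pass a plain sched_t * around and can't poke at the internals, and nobody has to count the stars in void***.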
"wyse 60" booted from a virtual server / service
You missed my point, or my thought. My suggestion of the use of the terminal was only for the use case of a display that needs very little compute: basically just a display. If it were a ground-up build, possibly AVoIP or HDBaseT from a local controller. I am not knocking distributed computing in any way.
I will say that cost is the driver behind the evolution of distributed and consolidated computing. Single-node compute power was simply too expensive in the not-too-distant past, so you had to log into a more powerful device to accomplish CPU-intensive tasks. I have been told stories about a complete printing infrastructure based around a Unix box and a few printers. The reason was cost. Now we have so much power per host or node (I am not sure of the parlance) at such a low cost that distributed is the only way to fly for almost everything. On the other side of the coin, the more intensive situations such as weather, global stock market trends, nuclear impact testing, and Big Data (I am using "big data" correctly) force a small sector back into the consolidated model for science, and a large sector for anything database intensive.
For the last 50 years we've been working hard to implement pervasive, distributed computing and you want to go back to the 60s? Just because IT is too incompetent and insecure and their knee-jerk reaction is to take control over everything...
No thanks.
I don't even want to install "apps" on my devices.
It's html or the highway for me.
But then over in the R&D World we don't let IT within 100 miles of our stuff. They're likely to break it.
Also, I believe that the use of a server / service / webapp to do everything is crazy, as if we need a remote server to play a single-player game or flip a Wi-Fi switch on the local network. I have a 12-thread machine with 24 gigs of memory and mirrored storage, yet I am expected to maintain an internet connection and log into a web service to type a letter, set up a low-level Wi-Fi router, draft a 2D object for my CNC, or use GPS (even though the devices have GPS hardware). It's ridiculous that this is the standard MO.
@tonyEE Found this in the junk pile of old tech, it is not what we (the community at large)
Because once they have your data they have you. If you've got (say) 1 TB of data you're not going to shunt it elsewhere on a whim; there's a lot of inertia, and in many instances people will be paying rent for the rest of their lives. It's a good business to be in.
It would make more sense to buy some sort of NAS, use RAID for disc redundancy and open a port on the router so you can access it. But most people just don't know enough to be able to do it securely if at all. Then again a NAS+discs is going to be north of $1000 if you're mirroring and want a decent capacity.
The guy in the videos posted earlier was suggesting that the files sent to the PCs actually contained something akin to P-Code which was then run in an interpreter in the driver. The reasoning being (as I mentioned earlier) that the release process for drivers is laborious and you can't afford to wait for weeks to block a zero-day. But it is effectively sidestepping the WHQL certification process that Microsoft have insisted on for kernel drivers so they don't murder the OS.
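Nobody outside CrowdStrike knows the real channel-file format, so purely as an illustration of the general idea of "a data file that's really a program", a toy interpreter of that sort looks something like this (opcodes and layout entirely made up):

#include <stdio.h>
#include <stdint.h>
#include <stddef.h>

/* Toy opcodes for an imaginary detection-rule bytecode.  Nothing to do
 * with CrowdStrike's actual format, which isn't public. */
enum { OP_HALT = 0, OP_PUSH = 1, OP_ADD = 2, OP_PRINT = 3 };

static int run(const uint8_t *code, size_t len)
{
    int stack[16];
    int sp = 0;

    for (size_t pc = 0; pc < len; ) {
        switch (code[pc++]) {
        case OP_HALT:
            return 0;
        case OP_PUSH:
            if (pc >= len || sp >= 16) return -1;   /* truncated file or stack overflow */
            stack[sp++] = code[pc++];
            break;
        case OP_ADD:
            if (sp < 2) return -1;
            stack[sp - 2] += stack[sp - 1];
            sp--;
            break;
        case OP_PRINT:
            if (sp < 1) return -1;
            printf("%d\n", stack[--sp]);
            break;
        default:
            return -1;                              /* unknown opcode: reject, don't crash */
        }
    }
    return -1;                                      /* ran off the end without OP_HALT */
}

int main(void)
{
    const uint8_t good[] = { OP_PUSH, 2, OP_PUSH, 3, OP_ADD, OP_PRINT, OP_HALT };
    const uint8_t bad[]  = { OP_PUSH };             /* a truncated "update" */

    printf("good file: %d\n", run(good, sizeof good));
    printf("bad file:  %d\n", run(bad, sizeof bad));
    return 0;
}

In user space a malformed file just makes run() return an error; skip one of those bounds checks in a kernel-mode driver and the same mistake is a page fault and a blue screen.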
However it's dealt with, it's not going to be easy or quick to change things...
The whole mess gets a little more amusing for users of that product. A $10 gift card for their troubles. Just do not expect to redeem it.
Wow, just wow. I sell software I write. I've had a few booboos I sent to customers, and as a result have given them a free year the next time the lease was up, not to mention an immediate extra copy or two or three to help them recover the time lost. 10 bucks. Seriously? 10 bucks is like what, 5 minutes of time now.
https://techcrunch.com/2024/07/24/crowdstrike-offers-a-10-apology-gift-card-to-say-sorry-for-outage
I guess I am not explaining myself correctly, or you are playing "devil's advocate." Or maybe you are trying to "catch me out."
Centralized / cloud management / server-side programs have their place, but it is not in the use cases that I mentioned.
If someone is not mirroring their online storage, that is an ID10T error. If someone is using online facilities for their active data environment, I cannot think of a use case that would involve anything close to a terabyte, but again, without a "hard" backup it's ID10T.
The prevalent use case for online active data, à la Office 365 or Google Docs, is low volume by non-engineer / IT / IFS / informatics type people. I use a similar system for software that I might need on the fly and for low-security file sharing / live documents. With these systems you basically get a free distributed file system with fantastic file syncing. All you have to do is click a button to have the data available offline, and you will never be under the thumb of the company that is trying to ransom your data.
Now for your other use case of large data sets that may need a NAS in the home environment, beyond backup, video storage, and a low-power-consumption device: I think that in most cases a simple dual-drive USB appliance would do fine.
Now, to make myself clear: these are not the solutions that I use or that I think anyone on the tech end of the spectrum should use, but we do things differently, our focus is different, and most importantly what we do is different.
I can only describe my environment for obvious reasons.
The computers and tech around me right now, at random, are:
1. This PC, which is the shop computer, with three 32-inch displays and one 42-inch display so that I can view streaming shows and design drawings (woodworking) from any location in this shop.
2. A laptop that is set up with WebSockets to control an RS-232 receiver (just goofing off, trying to learn a little about the background goings-on with Kramer Control, RTI, and Control4).
3. An old HP 6300 dual-core running Debian / VirtualBox with a Home Assistant VM (trying to learn a little about the Home Assistant system).
4. A Cisco 3750X PoE+ (working with some Aruba Instant On stuff, and this is the only PoE switch I have lying around).
5. And an eero AP (got them for free and set them up as APs; they only handle the Wi-Fi, no router functions). They are a great plug-and-play mesh system, but if I can't log into the device it doesn't last long around here.
6. Just because I am close to them, I will mention the 3 HP "LeftHand" SANs and the half dozen DL360s (all free stuff).
NOTE: none of this crap is part of my core system that I use for personal and household management (not digital household management), business communications, and field work. Just another example of how we use computers differently.
My point is that our use cases are so far from the average user's that it's hard to say what is correct or even necessary for that environment. We certainly can say that the environment is substandard, wasteful, and unusable for our needs, but not much more past the normal "you should have 2 backups" type thing.
There is another computer user around me who has a desktop and a laptop. All accounting, her shop entertainment, bulk storage, forum management, and various other things happen on the desktop. She has an Office 365 account with 1 TB of storage that is locally mirrored.
Her 2nd backup is via an external drive. Perfect solution for 9 bucks a month. Would that be acceptable to us? No chance.
Sorry about the rant, but what the heck, this is the Lounge.
I don't try to "catch people out" or play devil's advocate. It's just a misunderstanding of what you were saying...
Do what works for you. The whole point is that we have an environment that makes us productive and happy in what we do, both at home and work.
The point I was making was a generalisation about the cloud and people's growing dependency on it, and how many companies see it as long term rental income.
I am with you 100%. And the business practices of some of the cloud service providers are criminal. For example, this eero that I have set up is very easy to "plug and play," so the lay user just sets it up and away they go, with seemingly no problems until any administration or advanced configuration needs to happen. Also, there is a subscription to unlock advanced security features. You lose internet and you quite possibly will lose wireless connectivity after a token's expiration period. You know it's the data / functionality ransom if the device can't phone home. Or my all-time favorite: you can't set up this $40 low-level Wi-Fi router without logging into an account that requires way too much information to create and letting that cloud service tell the device that it's OK to operate and be configured.
Not to mention the everyday mundane tasks that the lay user has been duped into doing in the cloud.
For a fee no less!
And you know what? I am lazy when it comes to using cloud services for stupid stuff, but I am not dependent. It frightens me that so many are.
So what you're saying is that they've put in a hook that could also open a back door to others?
Yes and no. But as has been discussed, you have to operate at the kernel level to protect a system from kernel-level exploits (a slight oversimplification, but not a huge one). Therefore you have to have a signed driver to operate in that realm. Now you are tasked with protecting the system from zero-day attacks, as in a named-pipe exploit. The fastest way to get the protection onto the system before the system can be attacked by said exploit is to feed information to the kernel driver, in this case a config file. There is a connection to the kernel, but it is not wide open. If you could get a file onto the system, I suppose you could crash it, but I don't know if you could control it. The software / driver reading the config file would likely catch the offending code and crash the system, or allow the code to try to run, and that code would throw an exception and crash the system. So it's a crash, but not a take-control situation.
But I too have the same question as you
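For what it's worth, the "catch the offending content" part is mostly mundane validation at load time. A minimal sketch, assuming a made-up header layout (the real file format isn't public, so every field and constant here is invented):

#include <stdint.h>
#include <string.h>
#include <stddef.h>

/* Invented header for an imaginary content/config blob. */
struct content_hdr {
    uint32_t magic;        /* expected constant */
    uint32_t version;
    uint32_t entry_count;  /* number of fixed-size rule entries that follow */
};

#define CONTENT_MAGIC   0x43464731u   /* "CFG1", made up */
#define ENTRY_SIZE      64u
#define MAX_ENTRIES     4096u

/* Returns 0 if the blob is safe to hand to the rule engine, -1 otherwise.
 * The point: every field is checked against the actual buffer size before
 * anything downstream dereferences it. */
int content_validate(const uint8_t *blob, size_t blob_len)
{
    struct content_hdr hdr;

    if (blob == NULL || blob_len < sizeof hdr)
        return -1;                                  /* too short to even hold a header */

    memcpy(&hdr, blob, sizeof hdr);                 /* copy out to avoid unaligned access */

    if (hdr.magic != CONTENT_MAGIC)
        return -1;                                  /* not one of ours / corrupted */
    if (hdr.entry_count == 0 || hdr.entry_count > MAX_ENTRIES)
        return -1;                                  /* absurd count: reject */
    if (blob_len < sizeof hdr + (size_t)hdr.entry_count * ENTRY_SIZE)
        return -1;                                  /* file is shorter than it claims to be */

    return 0;
}

Reject the file and fall back to the last known-good one, and the worst case is reduced protection for a while rather than a boot loop.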
No, I don't think so. But I'm most emphatically not a security expert.
My concern is that they've shortcut the normal (and extensive) kernel driver validation process Microsoft have set up to ensure that kernel drivers don't bork the system.
Of course we'll know more in time.
But when you've got software deep in the most critical part of the operating system you really have to have processes in place to ensure nothing goes wrong.
Clearly Crowdstrike didn't.
Possibly the most innovative solution to recovering a CrowdStruck machine: when a server boots, it displays a BitLocker barcode. Use a barcode scanner pre-programmed with the BitLocker keys and hostnames, plugged in to act as a keyboard; it scans the barcode and enters the correct key. Much, much easier than typing the 48-character keys by hand. Really clever, and it cut recovery times to two or three minutes per box...
Related subjects: how many of you have multiple daily updated backups for when your PC goes on fire?
A Mac user that has enabled Time Machine.
Jan