It is not opt-in by default for business users. The feature flag doesn't show in org policies, and GitHub states that it's not scoped to business users.
Gah - you’re right - but given that I don’t use personal Copilot, but I do manage an organisation that gives Copilot to some of our developers, AND I was sent an email this evening making no mention at all of business Copilot being excluded, it could definitely have been communicated better…
> Again, your organization's Copilot interaction data is not included in model training under this new policy, but we are excited for you to enjoy the product improvements it will unlock.
Had a similar initial reaction, but I guess it's the idea that a 10yo computer for a third of the cost runs circles around the Neo in terms of flexibility and horsepower, which says something about the Neo's character, for sure. I'm certain the Neo was not intended to compete for those prizes, but for me it seems like a very practical benchmark to understand what lane it's in when otherwise not paying attention to Neo's or paleos.
I wonder if the target market for this thing is like my aged parents who basically use their phones for everything, even when it's obviously a painful experience, but they don't want a full-blown computer.
Kind of like a more intentional version of an iPad w/ keyboard? I'm not sure I fully understand what the difference is between an iPad w/ keyboard and the Neo... price, I suppose.
And macOS. The Neo's 8 GiB of RAM has been talked to death, but at the end of the day, iPadOS isn't macOS, which has a window manager with floating windows vs. iPadOS's (hacky) side-by-side mode.
> if someone uses a gun..why is the company providing the gun not held accountable here?
They absolutely can be held accountable. The Protection of Lawful Commerce in Arms Act (PLCAA) has carve-outs for: negligent entrustment - when a dealer or manufacturer provides a firearm knowing it will be used for a crime; negligence per se - when a seller knowingly violated state or federal laws in the sale or marketing of the product (and that sale was a proximate cause of the harm); defects in design; breach of contract/warranty.
However, selling a product for lawful use, whether a gun, truck, or Internet connectivity, does not make the seller liable if the consumer decides to use that otherwise lawful product for crimes. There has to be some assumption of agency (and liability) on the part of the individual who is clearing ethical/moral hurdles to do wrong.
I don't see how this unanimous court decision conflicts with that theory in the context of the ISP - in fact, I think it's a reinforcement of some common sense.
Click Start, type something to bring up search results, click the kebab menu in the top right (...), click "Search Settings", then disable "Show search highlights".
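If you'd rather script it (e.g. for a fleet of machines), the same toggle is commonly documented as being backed by a per-user registry value. The value name below is an assumption based on that documentation, not something the steps above confirm, so verify it on your build before rolling it out:

```
Windows Registry Editor Version 5.00

; Disables "Search highlights" for the current user.
; The value name (IsDynamicSearchBoxEnabled) is an assumption based on
; common documentation of this setting -- verify before relying on it.
[HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\SearchSettings]
"IsDynamicSearchBoxEnabled"=dword:00000000
```

Save as a `.reg` file and import it, then restart Explorer (or sign out and back in) for the change to take effect.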
Ha! This is the first time I've even tried the Win10 Search bar in months after constant disappointment from it, and it doesn't even load for me nowadays:
There are two wolves that live inside the Windows brain - Apple and Linux (yes, two wolves).
On one hand, Windows has pressure to be something that "just works" like an iPad used to be - users can't screw it up. This is what enterprises want for the daily drivers of their massive user populations.
OTOH, Windows has pressure to be this highly customizable tool for savvy high-agency individuals. This is what we all want.
I can empathize with both needs, for sure, but it is a constant war. They're doing alright, considering.
Hindsight's how we all learn. Doing it over again, I'm sure those guys would have done things differently. Any team would be crazy today to not be more prudent in how they operate.
Sure, the part I thought was "easy to say in hindsight" was:
> I would not have taken this gig unless you had verbal confirmation that the Sheriff knows about it and has signed off.
We don't know that! We don't know what we would have done in that scenario, especially in the context of a thread about the very outcome one's supposed foresight would have prevented.
> Research suggests that people still exhibit the hindsight bias even when they are aware of it or possess the intention of eradicating it. [...] The only observable way to decrease hindsight bias in testing is to have the participant think about how alternative hypotheses could be correct.
So here's an alternative hypothesis:
"Hey, do you reckon we should clear this with the county first? The sheriff might come and arrest us on the basis that nobody told him we were going to break into the courthouse"
"Nah, don't worry about it, I've done this sort of thing hundreds of times. And besides, the state has superiority over the county anyway, so even if we get caught which let's face it we won't because we're leet hackers and very incognito... the sheriff won't have any power to do anything to us as soon as we tell him it's authorised by the state"
This is not an "obvious in hindsight" thing, and it's also something that was discussed in the physical penetration testing community long before 2019, when this happened. Everyone makes mistakes, and they were legally in the right, but most in physical pentesting know: you're probably going to make someone look like a fool during your work, and your CYA needs to be rock solid, not just to absolve the illegality of what you're doing, but to cover the immediate consequences of that newly minted fool also having an ego and authority. A piece of paper will not save your life against a trigger-happy rookie cop in a dark hallway at 2am, even if it might ruin his after you're already dead.
And, by the way: The Sheriff was in the wrong and some of what happened to these pentesters should never have happened. But, this case is not nearly as clear-cut as some one-sided storytelling suggests it is. When the Sheriff called the contact numbers at the State of Iowa, one person didn't answer, and a second person said that they "did not believe the men had permission to conduct physical intrusion." One of the pentesters also blew lightly positive for alcohol. One of the men was from Florida, and the second from Seattle, working for a security firm out of Colorado. That's suspicion enough to end up in jail overnight.
The fact that it went on longer than that more so gets at the real story. The State was exercising an authority they had, maybe for the first time, against a security force that not only didn't know they were exercising it, but didn't realize they even had the authority in the first place. These guys got caught in the middle. The distribution of blame is pretty significant: the State should have informed the local security, but didn't. The State should have had contacts on-call during the intrusion, but didn't. Coalfire should have confirmed all of this in the interest of protecting their employees, but didn't. The testers shouldn't have been drinking beforehand, but did. The Sheriff should have dropped the matter the next day, but didn't. Sure, some of this is 20/20 hindsight, but taken in its entirety there were a lot of balls dropped, and it paints a picture of a state government that has some box to check for compliance and doesn't care how it gets checked or what gets found, and a security firm that isn't conducting its penetration tests responsibly.
Exactly. If I were in that position I would have simply learned from what happens in the future. In the rare instance that there was a negative outcome, I would just inform my previous self so that I could retroactively ensure that that outcome had not occurred.
It is through this simple system that I can confidently say that the content of this article that I am reading today in 2026 had/will have an impact on what I would have done in 2019.
Definitely some things could have been done a bit differently. I get that they want to keep staff, and even beat cops, in the dark, but it seems reasonable and prudent to bring the highest level of local law enforcement into the loop when planning red team exercises. The likelihood is high that the team will interface with law enforcement, so the escalation path within the enforcement side of the state regulatory machine should be cleared in advance.
I think the takeaway for security teams is that you shouldn't let the customer "authorize" what is otherwise criminal activity warranting a police response without getting some air cover from the enforcement side. Coordinating that is the customer's burden to bear, and that cover should be secured before letting them hand-wave away the risks with a "just have the police call me and I'll clear it all up". In hindsight only, when you look at it like that, the security team was not covering their ass appropriately. In a perfect world, you'd assume there's some better planning and communication going on behind the curtain. In the real world, you need more than the flimsy "guarantee" of calling a guy who knows a guy in the middle of the night. At the very least, that get-out-of-jail-free card should have had judicial representation and enforcement representation (e.g. the sheriff) as signatories.