Are you like building a mobile app or have 100k tests or is it just super slow?
It took me an embarrassingly long time to figure out what a “Pugina” was; I should have just clicked the link.
I feel like it’s rather disingenuous for the article author to just drop this section without mentioning that this is how all machine learning models are trained. The idea is that for now (and for the next year or whatever) it’s trained manually, until the system is good enough to do it on its own with a high enough accuracy rating to not lose money.
Now, since Amazon is shuttering this, it’s totally possible that they determined they’d need too many years of training data to break even, but at the very least this is standard industry practice for any machine learning model.
The containers still run an OS, have proprietary application code on them, and have memory that probably contains other users’ data. Not saying it’s likely, but containers don’t really fix much when it comes to gaining privileged access to steal information.
In that case, it’ll steal someone else’s secrets!
There’s no way the model has access to that information, though.
Surely a Google product this important must have properly scoped secret management, not just environment variables or similar.
Oof, how did it end up going?
1Password does this too, and it’s magical. I’ve had my SMS go to my browser via Google Messages for a while, but it’s so much easier to just auto-fill it instead of copy/pasting.
It sounds like the error message a dev shop makes up when the PM doesn’t specify what the error message should be…
Cries in “let’s outsource this, it should be cheaper and faster”
React ugh, everybody is using NextJs these da- …oh, what’s that? We’ve moved on already?
With their naming standards, it would probably be called “U^2SB, Universal Super Fast Serial Bus”
More like nine judgemental workers looking at your telemetry data, but the acronym doesn’t work…
Not sure if that’s sarcasm, but the article is actually super insightful about a few different methods bad actors could use to accomplish the same feat (short of giving them a formula, from what I can read, but I’m not a battery maker).