They are also IR controlled. A lot of them have a little window on the front of the unit, and an array of transmitters in the ceiling.
Take what HR says with a grain of salt.
If they’re gaming H1B, they’re not going to say “yeah, we’re faking it to get cheap indentured immigrants to work for us”.
Couldn’t a hypothetical attacker just get you to visit a webpage, an image embedded in another page, or even a URL speculatively loaded by your browser? Then, from the v6 address of the connection, directly attack that address, hoping for a misconfiguration of your router (which is probable, as most of them are misconfigured in the dumbest ways).
Vs v4, where the attacker either sees just your router’s IP address (and then has to hope the router has a vulnerability or a port forward) or, increasingly, gets the IP address of a CGNAT block which might have another 1000 routers behind it.
Unless you’re aggressively rotating through your v6 address space, you’ve now given advertisers and data brokers a pretty accurate unique identifier of you. A much more prevalent “attack” vector.
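The tracking half of this can be sketched with Python’s standard `ipaddress` module (the address below is a made-up example from the IPv6 documentation range): even if the host rotates the lower 64 bits, the /64 prefix still singles out the connection.

```python
import ipaddress

# Hypothetical client address from the IPv6 documentation range (2001:db8::/32).
client = ipaddress.IPv6Address("2001:db8:abcd:1234:a1b2:c3d4:e5f6:7788")

# Privacy extensions (RFC 8981) rotate the lower 64 bits (the interface
# identifier), but the /64 network prefix is typically stable per connection,
# so a tracker can key on the prefix instead of the full address.
prefix = ipaddress.IPv6Network(f"{client}/64", strict=False)
print(prefix)  # 2001:db8:abcd:1234::/64
```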
If you still do the sizing (it’s not entirely wasted, as it’s a reasonably effective tool to gauge understanding across the team), this can still be done without the artificial time boxing.
“How much work have we done in the last two weeks?” Just look at all the stories closed in the last two weeks. Easy.
“When will X be delivered?” Look at X and all its dependencies, add up all the points, and guesstimate the time equivalence.
Kanban isn’t a free-for-all; you still need structure and some planning. But you take most of that away from the do-ers and let them do what they do best… do.
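Both questions above can be sketched in a few lines (all story data, dates, and point values below are made up for illustration):

```python
from datetime import date, timedelta

# Hypothetical closed stories: (close_date, story_points)
closed = [
    (date(2024, 5, 1), 3),
    (date(2024, 5, 6), 5),
    (date(2024, 5, 9), 2),
    (date(2024, 4, 10), 8),  # outside the two-week window
]

today = date(2024, 5, 10)
window_start = today - timedelta(days=14)

# "How much work have we done in the last two weeks?"
# -> sum the points of stories closed inside the window.
velocity = sum(pts for closed_on, pts in closed if closed_on >= window_start)
print(velocity)  # 10

# "When will X be delivered?"
# -> remaining points for X and its dependencies, divided by the
#    observed velocity per fortnight, gives a rough time equivalence.
remaining_points = 25
fortnights = remaining_points / velocity
print(round(fortnights * 14))  # ~35 days at the current pace
```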
I thought everyone decided “jfgi” in online discourse was toxic years ago. It’s the same attitude as:
“Chemtrails make you sick!”
“How so?”
“Go do your own research.”
If you’re going to report on something, provide a little more information than just “no”. It’s more helpful, better for the community, and in 5 years’ time, when the facts are different, there’ll still be a reference for what was factual in the past.
I’m old, I have other shit to do, and I don’t have the time. If I’m writing code, I’m doing it because there is a problem that needs a solution. Either solving someone else’s ‘problems’ for $$$, or an actual problem at home.
If it’s a short term problem like “reorganising some folders” I’m not going to (re)learn another language. I’m going to smash it out in 30mins with whatever will get the job done the quickest, then get back to doing something more important.
If it’s an ongoing problem, I’m going to solve it in the most sustainable way possible. I might fix the problem now but 100% someone’s going to drop support or change an API in 2 years time and it’ll break. Sure, doing it in Chicken would be fun. But the odds are, I won’t remember half the shit I learned 2 years later. It’ll be unmaintainable. A forever grind of learning, fixing, forgetting.
So without a commercial driver to actively invest in Lisps, there’s no point. It’s not profitable and it doesn’t solve any problems other tools can’t. Without the freedom youth brings, I don’t have the time to do it “for fun”.
I love Lisp. Well, Scheme, and less so Clojure. I don’t know why. Is it macros? Is it the simplicity? Or is it just nostalgia from learning it during a particular time in my life?
But I just can’t find a place for it in my life.
It’s not job material; effectively nobody uses it. It doesn’t solve basic problems with the ease that Python does.
And because of this, anything I do in it is nothing more than a toy. As soon as I put it down, I have no hope of picking it up or maintaining it 6, 12, or 24 months later.
A toy I spend 2 weeks on in absolute joy, but as soon as life gets in the way, it’s dead.
Move to NZ. It’s nearly all C# here.
Ditto… ish.
In my dream I mixed up some constraints of the real-world system. I still came up with an elegant solution that would have worked if the dream’s constraints were true. Except they weren’t, and the solution was useless.
Bonus: the dream-solution exposed a “front door”, so to speak, on the real problem, and I felt dumb that I’d even spent 5 minutes thinking about it.
A bottle is hard to rinse, lid or not.
Wouldn’t you just rough chop the material and then rinse it?
Plug a USB-C screen into a USB-C port. Will it work?
Maybe? If the manufacturer has wired the port to the GPU for DP/HDMI alt mode, it might.
… but you’ve used this display on this laptop before?
Try another port! Nope, still nothing.
Maybe it’s the cable? Rummage around through your cables and try a few out. Hope you don’t have any from the 2010s because there’s a good chance they’ll ruin your device.
The screen works! But performance is terrible. Why? It’s running in DisplayLink mode.
You give up and suffer through.
This is too close to home.
I used to use ANSI, but then I moved to England, bought a laptop, and returned it because of the “weird” ISO keyboard. After that I always bought Dell because I could customise the keyboard.
Moved back to ANSI-land, but I’ll still probably just buy Dell.
Is this just the cost per raw watt produced?
Is it a fair comparison vs conventional fuel-based power (coal/nuclear)?
Ie: if you wanted to build a plant capable of producing continuously, 24 hours a day, you would need some multiple of solar panels to produce an excess during daylight, plus storage.
Not that drastic drops in solar costs are a bad thing; it’s just, what would the cost-per-watt be if you had to power an average city on solar alone for a year?
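A back-of-envelope sketch of that multiple, with every number an assumed placeholder rather than a real project cost:

```python
# Oversizing solar + storage for 24/7 supply: illustrative numbers only.
city_demand_mw = 1000      # average continuous demand of a hypothetical city
capacity_factor = 0.25     # panels yield ~25% of nameplate averaged over a year

# To average 1000 MW you need some multiple of nameplate capacity.
nameplate_mw = city_demand_mw / capacity_factor
print(nameplate_mw)        # 4000 MW of panels

# Overnight + cloudy-day buffer: say 16 hours of demand held in storage.
storage_mwh = city_demand_mw * 16
print(storage_mwh)         # 16000 MWh

cost_per_w_panels = 0.80     # $/W installed (assumed)
cost_per_kwh_storage = 300   # $/kWh (assumed)
total = (nameplate_mw * 1e6 * cost_per_w_panels
         + storage_mwh * 1e3 * cost_per_kwh_storage)
print(f"${total / 1e9:.1f}B")  # $8.0B

# Effective cost per *continuous* watt delivered, vs the raw panel $/W:
print(total / (city_demand_mw * 1e6))  # 8.0, i.e. 10x the raw panel figure
```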
I can think of applications of Weta’s MASSIVE in games.
They do a lot of work on mocap technology, which is used in game dev.
And sure, movies render at minutes per frame, but the knowledge and skills developed producing them can be applied to game development. It’s not 1:1, but there are transferable skills. And there’s always emerging technology: take Gaussian splatting, which could potentially take realistic low-fps CGI scenes and make them realtime.
Weta is researching and building (amongst other things) graphics processing technologies.
Being able to take cutting-edge technologies from the film industry, optimise them, and sell them as “click and go” solutions in Unity would be a huge win.
It’s America, so the answer is probably “No”.
Do you not have consumer protection laws?
We’ve had digital price tags for decades, but you couldn’t do this in NZ. Stores are obligated to sell you a product at the price they advertise it for AND to keep a reasonable quantity of units at that price… you couldn’t sell 1 TV for $1.
So these systems would need to track what price you saw it at.
(Caveat: Our stores are still cunts and have been found to overcharge people)