I had been working on a post, but the concerns I was trying to express and analyze are so far outside my bailiwick that the post was just going to be a demonstration of the Dunning-Kruger effect.
I am not a computer expert.
I'm a Mac user...which, I think, is the opposite. Aside from my TI99, my first computer was an old Macintosh that I pulled out of a dumpster in '98, and I've gone on buying iMacs via inertia ever since.
Computers are black box technology to me, the only thing I know about how they work is that they need good industrial design to keep the magic smoke inside so they can access the ley lines that make up the internet.
I'm trying to improve, but there's a lotta shit going down right now that's tech related. So I sent some questions to one of the blog's Crack Team of Science Babes, but the answers I got were non-specific and unhelpful.
However, several of my readers are tech savvy, to the point they make their living in IT or computer engineering.
I have some questions...
I am, as I've admitted....(looks both ways and whispers in shame)...a Mac user. I have very little experience with PCs, and Macs are not tinker friendly...at all. I'm not, realistically, going to be assembling one anytime soon. Can anyone recommend a good pre-built gaming/streaming PC?
The cyber security threats we are seeing now are on a par with what was predicted for Y2K, but they seem more substantive than Gary North ranting in his basement. Some of you do this for a living. Do you see any way to truly mitigate these vulnerabilities? Prior to the 1980s, many of the crucial functions that are so susceptible to digital manipulation now were handled quite adequately with manually operated or mechanically regulated systems. Is it just cost savings driving this transition to vulnerability, or were regulations involved? Given that we used to operate refineries and power plants without these vulnerabilities, are there any hard engineering reasons, distinct from political or cost/benefit reasons, that we could not install manual/mechanical backups to keep some hacker from blowing up our stuff?
Uhhh.
Does this have any implications for non-Russians going forward? The internet was designed to survive, or at least fail gracefully, in the event of a nuclear war. The cybersecurity issues and cloud storage seem to be glaring vulnerabilities that can't be mitigated by the users, since at least in the case of the "cloud" all aspects of security, durability, and access privileges are completely out of the hands of the user. Is there realistically anything to be done other than "Buy your own damned servers and maintain them!"? I confess I'm a little skeptical of the cloud on principle, but I've got a degree in History, which doesn't give me a broad and deep understanding of the problem...if there IS a problem.
1
If you have a good handle on the happenings in the Apple universe, you might want to take a look at running Windows on a Mac.
Otherwise, buying Dell has consistently brought about a better result than buying HP for me. But you know, chasing a high-end PC seems like a whole other subculture.
Posted by: Pete Zaitcev at Fri Mar 18 21:23:10 2022 (LZ7Bg)
2
I'll second Dell. My current computer... the one at Pond Central... was a no-longer-top-of-the-line system from them, and other than one ridiculously busy relationship chart, it's handled everything I've thrown at it. It played The Outer Worlds at max everything (most of the time) in the 80-100fps range.
KSP is the only thing that's really made it chug, but that was with a 200+ part ship that caused powerful systems to weep and lesser boxes to play the game in spf (seconds per frame), not fps.
Plus, y'know... pretty blue lights.
Posted by: Wonderduck at Fri Mar 18 21:50:27 2022 (DB9Lx)
3
I'll put these in my weekend Q&A column.
Posted by: Pixy Misa at Sat Mar 19 01:47:34 2022 (PiXy!)
4
I'm having trouble posting a 1273 word/7827 character reply.
This is a test.
Posted by: PatBuckman at Sat Mar 19 21:12:08 2022 (r9O5h)
5
There are still a LOT of chips made in the US. I have a couple of friends who wore bunny suits for a living in Portland and CA. I know Intel has been scrambling to bring more fabs online, and there are a lot of other chip makers still in the US.
https://www.axios.com/computer-chips-manufacturing-america-10dcfe13-64f3-4ea9-ad4a-cb189a00429a.html
The most advanced chips all need tech from ASML, a Dutch company that makes the equipment to make the chips.
I use a Mac for personal stuff, but I've used a lot of Dells, hand-built PCs, and other brands (Maingear, Boxx, HP, Fujitsu, Sony, etc.) for work over the years. The Dells are the most solid.
Posted by: punchyninja at Sat Mar 19 21:18:20 2022 (qnek2)
6
I do not understand just why the neon is being consumed. It really should not be. Of course the last time I worked with a gas laser was in the 1980s, so I may be 40 years out of date. But back then the gas was in a tube with two 100% glass (tilted) ends. It lasted basically forever.
Posted by: Pete Zaitcev at Sat Mar 19 23:30:07 2022 (LZ7Bg)
7
Hardware: These are all basically electronic circuits. You can learn more by studying electrical engineering, or by going to a school for electronics technicians.
Electrical engineering can work differently at higher frequencies than it does at zero frequency (DC). At DC, you have your usual basic electrical circuit theory, where you can put wires on a battery and a light bulb, and make the light bulb light up. Computers are one of the many electrical engineering applications that involve periodic signals at higher frequencies.
Hardware electrical engineering makes a system out of a bunch of circuits, and electrical engineers either make components, or buy components to arrange into circuits and systems.
Printed circuit boards (PCBs) are a composite laminate sandwich of metal and something like fiberglass. The engineers call the fiberglass layer a 'dielectric', and you can think of it as an insulator. The traces are made by etching some of the metal layers. The PCBs in computers use many, many layers, and are complicated and difficult to design, and they assume that very specific components are available to attach to the PCBs and complete the working circuits/systems. These components are ordered through places like Digikey and Mouser, and sometimes you can't assemble more boards if you can't get more of the same component, like if the manufacturer has production issues or quits making them.
One type of component that gets attached to a PCB is an integrated circuit (IC). An IC is made by etching silicon wafers. A specific model of IC has a known set of circuits on the wafer, and a known set of connections in a pin-out format. Some IC circuit designs can be made on different 'processes', at very different resolutions. If the available capacity for process A is maxed out, in theory you can make some more with unused capacity for process B. Adding IC process capacity requires building a building to house capital equipment that has a very long lead time. These ICs are the chips being discussed.
The capacity for the best processes is very limited; there are very few locations producing the ICs for the latest CPUs. A CPU is a module that plugs into the motherboard (a very fancy, expensive, hard-to-get PCB). Some generations of CPU were themselves PCBs with ICs on them.
There is an extremely broad need for ICs that are made using simpler, older processes. Here's a random one I found:
https://www.digikey.com/en/products/detail/texas-instruments/SN74HCT04DR/276848
This is built on a process that can be at least thirty years old. You can buy them at 8 per dollar if you buy 2500. In quantity, you buy them packaged so that your fancy PCB assembly machine can feed them easily. It may show signs of a chip shortage: 35 days until new inventory, about 750 in stock, and about 1,400 in stock for a suggested substitute.
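As a back-of-the-envelope illustration of how those quantity price breaks work: only the "8 per dollar if you buy 2500" figure comes from the part quoted above; the other tiers in this sketch are invented placeholders, since real Digikey pricing tables vary by part.

```python
# Sketch of quantity price breaks for a commodity logic IC.
# Only the 2500-unit tier ($0.125 each, i.e. 8 per dollar) is from the
# comment above; the other tiers are hypothetical, for illustration only.
PRICE_BREAKS = [  # (minimum quantity, unit price in dollars)
    (1, 0.50),       # hypothetical single-unit price
    (100, 0.25),     # hypothetical mid-tier price
    (2500, 0.125),   # "8 per dollar if you buy 2500"
]

def unit_price(qty):
    """Return the unit price that applies to an order of `qty` parts."""
    price = PRICE_BREAKS[0][1]
    for min_qty, tier_price in PRICE_BREAKS:
        if qty >= min_qty:
            price = tier_price
    return price

def order_total(qty):
    """Total cost of an order at the applicable tier price."""
    return qty * unit_price(qty)

print(order_total(2500))  # 2500 * $0.125 = $312.50
```

The point of the tiers is that parts this cheap are only cheap in volume, which is also why a shortage at the bottom of the market can stall assembly of boards that cost thousands of times more than the missing part.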
All of these processes are extremely delicate and sensitive. Some of them may have alternative possibilities besides neon, or the other expensive stuff, but switching over an IC production line would take time before it was producing volume at quality again.
IC and PCB supplies are important if you are building new stuff, or repairing or replacing old stuff. If you have a good inventory of spare parts, this wouldn't hurt you. So, this is basically a slice of the broader issue of JIT making things fragile.
Beyond that, I dunno.
Posted by: PatBuckman at Wed Mar 23 20:20:49 2022 (r9O5h)
8
Software: Actually, this is a bunch of fads, plus labor cost, and a different set of skills.
You have fads that are purely Big Tech. You also have broader business/MBA/industry fads. Anything that a journalist can boil down into a phrase, and speak loudly about without having any understanding, can become an industry-wide fad. Internet of Things is probably such a fad.
These industrial control systems were put in for what seemed to be attractive reasons: a) solid-state electrical systems often break less than mechanical systems, so potentially less downtime from repair; b) a range of behaviors that can be adjusted; c) more data collection; d) not requiring people to be present for the system to be managed.
There are a bunch of flavors of problems. One, there are two different fields involved, and they do not think in a very similar way. IT security and plant operations/industrial controls engineering are extremely different. Walt Boyes, a controls industry trade journalist, used to say that he knew how to teach people to secure this stuff, and that it was easier to teach it to plant engineers. Two, a lot of implementations are carried out without a full understanding of everything involved. Someone exposed to the basics of both IT security and industrial controls engineering is rare; often implementors, or organizations, are missing one or both skillsets. Management making the decision tends to be ignorant of the real complexity and risks. Three, organizations can blindly push through changes in IT that cause additional vulnerability to the industrial controls. Suppose your industrial controls in scores of locations are controlled from three sites, properly air-gapped from the internet. Suppose your HR decides that people are working from home, starting yesterday. Can you say 'insecurely designed control system now accessible through whatever internet gateway IT implemented in the middle of the night before start of business today'?
So, we were doing bad stuff before, but the lockdown was insane bullshit. We might not be fixing it as long as management contains woke people who consume mainstream media.
IT practice, generally, has a lot of security vulnerability. Partly because management is able to demand that things be done and fads be followed, and partly because the people being trained often have holes in their understanding.
Industrial controls? Hardware isn't free, and you have to program the software. You are typically tied to a vendor who sells stuff that can be programmed using tools that one of your engineers understands. PLCs are one of the items that you may purchase for this. PLC is one search term, and LabVIEW is another. LabVIEW is a weird programming language that has some industrial applications along these lines.
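For readers who have never touched one: a PLC basically runs a fixed scan cycle, over and over, of read inputs, evaluate the control logic, write outputs. Real PLCs are programmed in ladder logic or vendor environments like the ones named above; this is only a toy Python sketch of the idea, with a made-up tank-filling example and 20/80 thresholds chosen arbitrarily.

```python
# Toy sketch of a PLC-style scan cycle: read inputs, run logic, write outputs.
# The tank, its fill/drain rates, and the 20/80 thresholds are invented
# purely for illustration; they do not model any real plant.

LOW, HIGH = 20.0, 80.0  # hysteresis band, in percent of tank capacity

def scan(level, pump_on):
    """One scan: decide the pump output from the current level input."""
    if level <= LOW:
        pump_on = True    # tank nearly empty: start filling
    elif level >= HIGH:
        pump_on = False   # tank nearly full: stop filling
    # between the thresholds, keep the previous state (hysteresis)
    return pump_on

def simulate(scans=200):
    """Run the scan loop against a crude simulated tank."""
    level, pump_on = 50.0, False
    history = []
    for _ in range(scans):
        pump_on = scan(level, pump_on)
        level += 2.0 if pump_on else -1.0   # crude fill/drain rates
        history.append((level, pump_on))
    return history

history = simulate()
# The level should stay within the hysteresis band.
assert all(LOW <= lvl <= HIGH for lvl, _ in history)
```

The hysteresis band is the interesting design point: without it, the pump would chatter on and off every scan around a single setpoint, which is exactly the kind of behavior mechanical and pneumatic controllers also had to be designed around.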
There are also fundamental technical vulnerabilities driven by Big Tech fads, as opposed to fads within the whatever-widget industry, of MBAs wanting /their/ company to be the first to implement $technical_buzzword.
There are still a lot of mechanical, etc., controls being used. I've heard tell, inside the past five to ten years, of folks who still have pneumatic controls in place. a) Wholesale replacement with a different system is typically expensive, on the budgets that these places have for that. b) If you change types of control, your workforce needs different skills. This explains why old methods are sometimes retained, but I meant it to explain why changing back would at least take time.
i) Cloud always struck me as being dumbass. ii) Anyone who bought cloud-based PLCs probably deserves to hang. Cloud was a pretty significant fad, so there are a bunch of people who are doing that, and should have instead bought their own servers and paid for the necessary manpower.
I want to say that people running factories of course had enough sense not to... Hearsay suggests that this is manifestly untrue. The best effort to understand broader trends in American society suggests we are maximum retard, independently of business hearsay.
tl;dr: We have been f&cking up a bunch of stuff for decades. It took a lot of effort to pile up all of this staggering incompetence. I didn't even mention that our foreign policy weakness substantially contributed to hostile foreign actors' willingness to give us metaphorical swirlies. We will not quickly be addressing matters.
Posted by: PatBuckman at Wed Mar 23 20:21:43 2022 (r9O5h)