May 16, 2021
This is a Worthy Video.
Isaac Arthur looks at less grounded sci-fi weapons and comes to some interesting conclusions.
Watch on Bitchute
Isaac Arthur mentions a lightsaber video. It is here and it is awesome! There are follow-ups here and here, where they demonstrate that Styrofoam is a suboptimal armor material, and that mobile homes are as weak to lightsabers as they are to tornadoes.
UPDATE:
There is a moment in the video where Mr. Arthur predicts that hacking will be viable for only a short period, as cybersecurity is improving rapidly.
Colonial Pipeline says "Hi!"
Admittedly he's talking long term, but I was under the impression that anything can get hacked. I am curious what IT experts think of his rather optimistic prediction.
Posted by: The Brickmuppet at
06:40 PM
| Comments (4)
| Add Comment
Post contains 123 words, total size 2 kb.
1
It is the classic race between projectile and armor. So far, there has been nothing to suggest that armor can sustain anything beyond momentary periods of superiority when it comes to IT.
Posted by: cxt217 at Sun May 16 22:19:23 2021 (4i7w0)
2
The key problem with security isn't systems, it's people. People are dumb.
Posted by: Pixy Misa at Mon May 17 01:09:51 2021 (PiXy!)
3
Yeah, what they said, but also, security inevitably ends up being added to existing systems rather than being present from the start, leaving gaps that can be exploited years later. Particularly in startups, security (and, honestly, process in general) is seen as a barrier to being "lean" and "agile", and a security team formed later won't even know about all the shortcuts that were taken until something explodes.
When WebTV was moved onto the brand new Silicon Valley campus, it was our first experience with being on the real Microsoft corporate network, and it was riddled with malware. You could not successfully download all the patches for a brand-new Windows machine before it was compromised; you had to install updates from a CD before connecting to the network.
Why was it so bad? Because engineers all over the company had desktop machines with a second ethernet port plugged directly into the public Internet for convenient data center access, and many of them were "accidentally" configured as routers, bypassing the firewalls.
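The "accidentally configured as routers" condition J describes can be checked from the host itself. A minimal sketch in Python, assuming a Linux machine (the paths are the standard kernel interfaces, but the function name and the overall check are illustrative, not anything from the original incident):

```python
import os

def is_accidental_router(proc_path="/proc/sys/net/ipv4/ip_forward",
                         net_dir="/sys/class/net"):
    """Return True if this Linux host forwards IPv4 packets AND has more
    than one non-loopback interface -- the 'accidental router' condition
    described above. Returns False when the kernel interfaces are absent
    (e.g. non-Linux systems), rather than raising."""
    try:
        with open(proc_path) as f:
            forwarding = f.read().strip() == "1"
    except OSError:
        return False
    try:
        nics = [n for n in os.listdir(net_dir) if n != "lo"]
    except OSError:
        return False
    return forwarding and len(nics) > 1
```

A dual-homed desktop with `net.ipv4.ip_forward=1` will happily route between its two networks; nothing in the default firewall posture of that era would have stopped it.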
-j
Posted by: J Greely at Mon May 17 10:16:39 2021 (ZlYZd)
4
Security issues result from more basic issues within the field of CS/programming.
There is an aerospace engineering analogy. This discipline is fairly young, as an engineering discipline, and only really goes back to Orville and Wilbur Wright.
In the 1950s, there were a lot of designers with experience building subsonic fighters during WWII. They made some supersonic fighter designs that were 'very unforgiving aircraft', aka 'lawn darts'. This was partly because they didn't have enough data about fluid flow at those speeds, and hence didn't really know how to design supersonic fighters.
There is a fundamental problem within aeronautical engineering: the fluid mechanics equations suck to work with,* and you have to have experimental data to do anything new and interesting. When you are moving into a new area of fluid behavior, like supersonics then, or maybe hypersonics now, it is not always clear which rules of thumb are no longer valid.
Whether you date the real start of CS to the 1940s or the 1960s, it is younger** than aerospace engineering, and definitely less mature as an engineering discipline. (Basically, programmers who cannot think as engineers, and engineers who cannot think as programmers, are hard to sort from the actual software programmer-engineers who understand which tools and techniques are reliable and which ones are not.) A defective program does not collapse of its own weight the way a defective structure does. In some cases you have to study the program, and understand software testing, to identify even catastrophically bad programs. So the customers do not have the obvious failures they could point to, the way they could for civil, mechanical, and aerospace engineering, which the public was able to use as a basis for forcing those engineers to develop a certain level of discipline maturity.
As of yet, it is not certain if these basic problems are fundamental problems the way fluid mechanics challenges are for aeronautical engineering.
If we could prove that they are fundamental problems, we could then prove that we would never achieve perfect security.
There are research areas where people are working on theory that could, in principle, deliver defect-free code when measured against a specification. But there would still be problems in defining the specification, and in finding experts one could trust to define the specification properly.
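In miniature, "defect-free when measured against a specification" looks something like the following: write the spec as an executable predicate, then check the implementation against it exhaustively over a small domain. A sketch using only the standard library (the function names and the toy sorting spec are mine, not from any particular verification system):

```python
from collections import Counter
from itertools import product

def meets_spec(xs, ys):
    # The specification: ys is an ascending permutation of xs.
    ascending = all(a <= b for a, b in zip(ys, ys[1:]))
    permutation = Counter(xs) == Counter(ys)
    return ascending and permutation

def insertion_sort(xs):
    # The implementation under scrutiny.
    out = []
    for x in xs:
        i = 0
        while i < len(out) and out[i] <= x:
            i += 1
        out.insert(i, x)
    return out

# Exhaustively verify the implementation against the spec for every
# list of length 0..4 drawn from {0, 1, 2}.
for n in range(5):
    for xs in product(range(3), repeat=n):
        assert meets_spec(list(xs), insertion_sort(list(xs)))
```

The catch raised above survives even in this toy: if `meets_spec` is itself wrong, the check is worthless, and real specifications are vastly harder to state correctly than "sorted."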
We don't understand all of the problems with software designed according to the best currently known principles. Therefore there are unknown security defects waiting to be discovered in the future, as we learn more; security is always catching up. If you do not always have people in place working on catching up, you will not stay current. And software design is really complicated, so when you staff a team on a limited budget, security people, and security work in general, are tempting things to cut from the team and from the scope of the project. The issue is, bolting security on afterwards is never as good. If you want secure software, any effort spent reducing the complexity of the fundamental problem is worthwhile, because it allows designing security into the system from the beginning and reduces the ongoing maintenance costs. The people with the money do not really grok this, so everyone tends to specify absurd levels of complexity into the basic feature requirements.
These other guys understand this much better than I do, and have hit the essentials more succinctly. I expect the wild-eyed crazies promising an achievable level of assurance would do best to look at those 'meets specification' theoretical areas. It runs right through the questions of measurable, important information loss, information delays, and organizational/people issues that make the general case of technocracy such a reliably bad idea.
*Look up the Navier-Stokes equations. The Millennium Prize for proving existence and smoothness of general solutions is really significant, and had not been awarded last I heard. And Navier-Stokes is itself a simplification of the more general cases of continuum fluid flow, and continuum assumptions do not hold for all cases of fluid flow of interest to aerospace engineers.
**Pre-Wright CS history tends to have parallels in pre-Wright aerospace engineering history. At the same level of preliminary development, aerospace is slightly older than CS, unless you take discrete mathematics or multi-step calculations as the earliest CS, and look back into mathematical history for things that may possibly predate kites.
Posted by: PatBuckman at Wed May 19 11:32:52 2021 (6y7dz)
39kb generated in CPU 0.1048, elapsed 0.3444 seconds.
71 queries taking 0.3356 seconds, 369 records returned.
Powered by Minx 1.1.6c-pink.