The exponential growth rate that Moore picked up in the 1960s had been driving technological progress since the beginning of the 20th century.
It is especially insightful if one wants to understand how technological progress mattered as a driver of social change. The extension of the time frame also makes clear how our modern computers evolved. It is an insightful way of understanding that the computer age really is the successor to the Industrial Revolution. One could also view the previous graph as a function of price instead of calculations per second; in this view you would find an exponentially decreasing price for a given product quality over time.
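Before turning to the implications, here is a minimal Python sketch of how quickly a fixed doubling time compounds; the two-year doubling period and 25-year span are illustrative assumptions, not values read off the chart.

```python
# A minimal sketch of how a fixed doubling time compounds over decades.
# The 2-year doubling period and the 25-year span are illustrative
# assumptions, not values taken from the chart discussed above.

def growth_factor(years: float, doubling_time_years: float) -> float:
    """Multiplicative growth after `years` at a fixed doubling time."""
    return 2 ** (years / doubling_time_years)

if __name__ == "__main__":
    factor = growth_factor(years=25, doubling_time_years=2)
    print(f"Growth over 25 years at a 2-year doubling time: ~{factor:,.0f}x")
    # ~5,793x, which is why a commodity laptop can catch up with a
    # decades-old supercomputer.
```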
The implication of this rapid, simultaneous improvement in quality and decline in price is that, according to a detailed discussion on Reddit, a current laptop has about the same computing power as the most powerful computer on Earth in the mid-1990s. In the chart shown we see the price changes in goods and services in the United States, measured as the percentage price change since a baseline year. Positive values indicate an increase in prices since that year, and negative values represent a price decline.
Here we see a distinct divide between consumer durables and technologies, which have typically seen a price decline, and service-based purchases, which have increased in price. Service-based roles such as nursing, healthcare, childcare and education have experienced little productivity growth relative to manufacturing sectors, which have seen continued improvement through technological innovation.
To retain employees in service-based roles, salaries have risen to remain competitive with industrial sectors; this increase in pay has occurred despite minimal gains in productivity. This may in part explain why the cost of education, healthcare and other services has risen faster than the general rate of inflation.
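The mechanism can be illustrated with a small, purely hypothetical calculation: if wages in both sectors track manufacturing productivity, the unit cost of a service whose productivity is flat rises even though nothing about the service itself has changed. The growth rates below are made-up round numbers, not measured values.

```python
# Hypothetical illustration of the cost-disease mechanism described above.
# Assumed round numbers: manufacturing productivity grows 3% per year,
# service productivity is flat, and wages in both sectors rise 3% per year
# to stay competitive.

years = 30
wage_growth = 1.03          # wages track manufacturing productivity
mfg_productivity = 1.03     # output per worker in manufacturing
svc_productivity = 1.00     # output per worker in services (flat)

# Unit labor cost = wage level / output per worker, relative to year 0.
mfg_unit_cost = (wage_growth / mfg_productivity) ** years   # stays at 1.0
svc_unit_cost = (wage_growth / svc_productivity) ** years   # compounds upward

print(f"Manufactured-good unit cost after {years} years: {mfg_unit_cost:.2f}x")
print(f"Service unit cost after {years} years:           {svc_unit_cost:.2f}x")
# The service ends up ~2.4x more expensive relative to its starting point,
# while the manufactured good does not, mirroring the divergence in the chart.
```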
The cost of keeping the machine running also matters. Electrical efficiency measures computational capacity per unit of energy, and it also matters for the environmental impact of energy production.
The progress in this respect has been tremendous: researchers found that over the last six decades the energy demand for a fixed computational load halved every 18 months.
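As a rough check on what an 18-month halving time implies, the short sketch below compounds it over six decades; the figures are back-of-the-envelope arithmetic, not measurements from the study.

```python
# Back-of-the-envelope implication of the halving time quoted above:
# if energy per fixed computational load halves every 18 months,
# how much more efficient is computing after 60 years?

years = 60
halving_time_years = 1.5  # 18 months

halvings = years / halving_time_years          # 40 halvings
efficiency_gain = 2 ** halvings                # ~1.1e12

print(f"Halvings in {years} years: {halvings:.0f}")
print(f"Energy per computation falls by a factor of ~{efficiency_gain:.1e}")
# Roughly a trillion-fold improvement in energy per computation.
```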
In this chart we see the computing efficiency of various processors over time. Here, computing efficiency is measured as the number of watts (a measure of electrical power) needed to carry out a million instructions per second, i.e. watts per MIPS.

Looking at these two pictures, it becomes immediately clear how fast technological progress has increased storage capacity.
Considering the time since the introduction of IBM's first hard-disk systems in the 1950s, the growth rate of storage capacity has not been as constant as for the other measures discussed before. Early on, technological revolutions boosted capacity stepwise rather than smoothly.

The framing that follows comes from our collaborations with author David Moschella. We hear a lot about machine learning and deep learning and think of them as subsets of AI. These models improve as they get more data and iterate over time.
The right side of the chart above shows the two broad elements of AI. The point we want to make here is that much of the activity in AI today is focused on building and training models. And this is mostly happening in the cloud. But we think AI inference will bring the most exciting innovations in the coming years.
Inference is the deployment of the model: taking real-time data from sensors, processing it locally, applying the training that was developed in the cloud and making micro-adjustments in real time. We love car examples, and observing Tesla is instructive as a model for how the edge may evolve. So think about an algorithm that optimizes the performance and safety of a car on a turn. The model takes inputs on friction, road conditions, tire angle, tire wear, tire pressure and the like.
The intelligence from this model then goes into an inference engine (a chip running software) that sits in the car, takes data from the sensors and makes micro-adjustments in real time to steering, braking and the like.
But it can also selectively store certain data and send it back to the cloud to further train the model. For example, if an animal runs into the road during slick conditions, maybe Tesla persists that data snapshot, sends it back to the cloud, combines it with other data and further perfects the model to improve safety.
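To make that loop concrete, here is a hedged Python sketch of what an edge inference cycle might look like. Every name in it is invented for illustration; this is not Tesla's software or any vendor's API, just the shape of the pattern: infer locally, act immediately, and selectively persist interesting snapshots for cloud retraining.

```python
# Hypothetical sketch of an edge inference loop. All names here are invented
# for illustration; this is not Tesla's software or any vendor's API.
import random
from dataclasses import dataclass

@dataclass
class Decision:
    steering_adjustment: float   # small real-time correction
    braking_adjustment: float
    anomaly_score: float         # how unusual the situation looks

def read_sensors() -> dict:
    """Stand-in for real sensor input: friction, tire angle, pressure, etc."""
    return {"friction": random.uniform(0.2, 1.0),
            "tire_angle": random.uniform(-5.0, 5.0),
            "tire_pressure": random.uniform(28.0, 36.0)}

def infer(reading: dict) -> Decision:
    """Stand-in for a model trained in the cloud and deployed to the edge."""
    slippery = reading["friction"] < 0.35
    return Decision(steering_adjustment=-0.1 * reading["tire_angle"],
                    braking_adjustment=0.2 if slippery else 0.0,
                    anomaly_score=0.9 if slippery else 0.1)

def edge_loop(steps: int = 100, anomaly_threshold: float = 0.8) -> list:
    """Infer locally, act immediately, and keep unusual snapshots for the cloud."""
    snapshots_for_cloud = []
    for _ in range(steps):
        reading = read_sensors()
        decision = infer(reading)
        # A real system would actuate steering/braking with `decision` here.
        if decision.anomaly_score > anomaly_threshold:
            snapshots_for_cloud.append((reading, decision))  # send back to retrain
    return snapshots_for_cloud

if __name__ == "__main__":
    kept = edge_loop()
    print(f"Persisted {len(kept)} unusual snapshots for cloud retraining")
```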
This is just one example of thousands of AI inference use cases that will further develop in the coming decade. The conceptual chart below shows the percentage of spend over time on modeling versus inference. And you can see some of the applications that get attention today and how these apps will mature over time as inference becomes more mainstream.
Modeling will continue to be important. But inference, we think, is where the rubber meets the road, as shown in the previous example. And in the middle of the graphic we show the industries, which will all be transformed by these trends. One other point on that: Moschella in his book explains why, historically, vertical industries remained pretty stovepiped from each other. Expertise tended to reside within each industry, and companies, for the most part, stuck to their respective swim lanes. But today we see many examples of tech giants crossing traditional industry boundaries: Amazon entering grocery, media and healthcare, Apple in finance and EVs, Tesla eyeing insurance. The enabler is data. Auto manufacturers, for example, will over time have better data than insurance companies.
DeFi, or decentralized finance, and other platforms using the blockchain will continue to improve with AI and disrupt traditional payment systems, and on and on.

Fundamental science limits exist because the laws of science cannot be changed. Spaceships can never go faster than the speed of light, no matter how clever the engineers.
The gravity of a planet can never be turned off. A single elevator shaft can never be made infinitely tall because eventually its cable is not strong enough to support its own weight. Beyond fundamental laws, the lack of available resources also places limits on technology. For instance, if you tried to build a skyscraper out of solid gold that reached the moon, you would use up all of the gold in earth's crust long before reaching the moon.
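A back-of-the-envelope calculation makes the point; the tower's cross-section and the crustal gold figure below are rough, order-of-magnitude assumptions, not precise data.

```python
# Rough, order-of-magnitude check of the gold-skyscraper claim.
# The tower cross-section and the crustal gold estimate are loose assumptions.

earth_moon_distance_m = 3.84e8       # ~384,000 km
tower_cross_section_m2 = 2500        # assume a 50 m x 50 m footprint
gold_density_kg_m3 = 19_300

tower_volume_m3 = earth_moon_distance_m * tower_cross_section_m2
tower_mass_kg = tower_volume_m3 * gold_density_kg_m3        # ~1.9e16 kg

crustal_gold_kg = 1e14               # very rough order-of-magnitude estimate

print(f"Gold needed for the tower: ~{tower_mass_kg:.1e} kg")
print(f"Rough estimate of gold in the crust: ~{crustal_gold_kg:.1e} kg")
# The tower needs on the order of 100x more gold than even a generous
# estimate of what the entire crust contains.
```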
Aside from the limits of raw physical resources, there are also limits on the amount of time, money, and energy a society is able to devote to a project. Building a bridge from New York to Paris is physically possible and requires no more raw resources than are readily available. But such a project will probably never be completed, because building thousands of miles of towers and spans would cost more than any country can afford.

Technological advances are driven by humans, and human intelligence has limits. While computers can greatly accelerate the speed of raw calculations, they cannot think creatively.
Innovation is driven by human intelligence and creativity, not by raw processing power. Every piece of software running on a computer had to first be designed and programmed by a human. Every physical law that a computer simulates had to first be derived by humans.
For example, computerized wind tunnel simulations can help an airplane designer optimize the aerodynamics of his plane without needing to build hundreds of prototypes. But the laws of aerodynamics had to first be discovered by a human and inputted into the computer before it could run its simulations. A computer can't do anything new. It just does faster what a human could do with a pencil and paper or a real wind tunnel. If a computer does something clever, it's because a clever human designed it to do that.
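For instance, even a toy aerodynamic calculation only works because a human first wrote down the governing relationship. The drag equation used below is standard physics; the specific numbers are arbitrary example values, not data for any particular aircraft.

```python
# Toy illustration: the computer only evaluates a formula that humans derived.
# The drag equation F = 0.5 * rho * v^2 * Cd * A is standard aerodynamics;
# the particular values below are arbitrary example inputs.

def drag_force(air_density_kg_m3: float, speed_m_s: float,
               drag_coefficient: float, frontal_area_m2: float) -> float:
    """Aerodynamic drag force in newtons."""
    return 0.5 * air_density_kg_m3 * speed_m_s ** 2 * drag_coefficient * frontal_area_m2

if __name__ == "__main__":
    force = drag_force(air_density_kg_m3=1.225,  # sea-level air
                       speed_m_s=250.0,          # example cruise speed
                       drag_coefficient=0.03,    # example streamlined body
                       frontal_area_m2=12.0)
    print(f"Drag force: ~{force:,.0f} N")
    # The computer evaluates this instantly, but the relationship itself
    # had to be discovered and encoded by people first.
```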
Because technological advances are driven by humans intelligently using tools, such advances are limited by the human brain.
Consider technological advances to be like apples on a tall tree. The low-hanging fruit is easily and quickly picked, but the higher levels of fruit are increasingly harder to reach. Virtually everyone who has finished high school can understand and apply a breakthrough from the 1600s, such as Newton's law of gravity. But very few people can understand and apply a breakthrough from the early 1900s, such as Einstein's gravitational field equations.
Developing an even more advanced theory of gravity than Einstein's would require first understanding Einstein's theory.