Nootropics, a.k.a. smart drugs, are supplements that enhance cognitive functions like memory and creativity. Caffeine is one of the most common nootropics and has been found to increase alertness and attention. Another compound shown to be psychoactive is L-Theanine. It is usually found in tea and has been shown to mitigate the negative effects of caffeine. L-Theanine seems to increase brain concentrations of serotonin and dopamine.
Unlike NZT from the movie Limitless, real-life nootropics have a much milder effect. And they come with a catch: most of these drugs have recorded side effects, both short term and long term.
There are also drug-free ways to achieve cognitive benefits. Exercising regularly, getting enough sleep and exposure to blue light in the morning are some examples.
Users are exposed to a number of platforms in the digital world. Each platform has its own quirks.
The typical strategy from a creator/publisher point of view is to create a core piece of content, modify it for different platforms and publish the variants strategically. There are services that let you do this fairly simply.
Then there are open protocols in this space, like RSS, which enable sharing of text-based content. Podcasts also follow a standard format, allowing a creator to publish to different podcast players at the same time. Spotify announced an Open Access platform that will allow creators to publish their paid audio content on Spotify while still maintaining a direct relationship with their audience.
The need from the user's point of view is quite clear. We have preferences between different types of content. Some might like to read a full-fledged article, watch a YouTube video on the topic, read a condensed Twitter thread or listen to it in podcast form. While the underlying content is the same, the delivery channel is different.
The ways in which the user can interact with the content also differ. On Twitter they can retweet, like or comment on a thread, while a blog post can be shared with others over email or whatever messaging or social media app they use.
Monetization differs between platforms as well. On YouTube the primary forms of monetization are the built-in AdSense ads or in-video sponsors, while a blog or podcast could run a direct subscription model with individual users.
Prudential Algebra is a decision-making method devised by Ben Franklin. Boiled down, it is a balancing act between the pros and cons of all the options involved in a decision, stretched over a time frame of a few days.
The method, as written by Ben Franklin in a letter to his friend:
In the Affair of so much Importance to you, wherein you ask my Advice, I cannot for want of sufficient Premises, advise you what to determine, but if you please I will tell you how.
When these difficult Cases occur, they are difficult chiefly because while we have them under Consideration all the Reasons pro and con are not present to the Mind at the same time; but sometimes one Set present themselves, and at other times another, the first being out of Sight. Hence the various Purposes or Inclinations that alternately prevail, and the Uncertainty that perplexes us.
To get over this, my Way is, to divide half a Sheet of Paper by a Line into two Columns, writing over the one Pro, and over the other Con. Then during three or four Days Consideration I put down under the different Heads short Hints of the different Motives that at different Times occur to me for or against the Measure. When I have thus got them all together in one View, I endeavour to estimate their respective Weights; and where I find two, one on each side, that seem equal, I strike them both out: If I find a Reason pro equal to some two Reasons con, I strike out the three. If I judge some two Reasons con equal to some three Reasons pro, I strike out the five; and thus proceeding I find at length where the Balance lies; and if after a Day or two of farther Consideration nothing new that is of Importance occurs on either side, I come to a Determination accordingly.
And tho’ the Weight of Reasons cannot be taken with the Precision of Algebraic Quantities, yet when each is thus considered separately and comparatively, and the whole lies before me, I think I can judge better, and am less likely to take a rash Step; and in fact I have found great Advantage from this kind of Equation, in what may be called Moral or Prudential Algebra.
One notable aspect of this method is the time frame recommended between the steps. There is enough time for both our conscious mind to spit out everything we know about the matter and our unconscious mind to spell out what it has picked up. This also dovetails with a decision journal: over time, the record of each prudential decision would be a great feedback loop for spotting flaws in your decision-making process.
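Franklin's cancelling of equal weights on opposite sides boils down to comparing summed subjective weights. A minimal sketch in Python, where the hint names and their weights are made-up examples, not part of Franklin's letter:

```python
def prudential_balance(pros, cons):
    """Franklin-style balance: sum the subjective weights on each side.

    pros, cons: dicts mapping a short hint to its weight.
    Striking out equal weights on opposite sides is equivalent to
    comparing the two totals.
    """
    balance = sum(pros.values()) - sum(cons.values())
    if balance > 0:
        return "pro"
    if balance < 0:
        return "con"
    return "undecided"

# Hypothetical decision, revisited over a few days as hints accumulate.
pros = {"better pay": 3, "new skills": 2}
cons = {"relocation": 4}
print(prudential_balance(pros, cons))  # pro
```

The dict form mirrors the two columns on Franklin's half sheet of paper; the weights stay deliberately coarse, in the spirit of his "Weight of Reasons cannot be taken with the Precision of Algebraic Quantities".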
We have about 6,000 thoughts every day. What if we could record, track and classify all those thoughts, and feed them into a giant neural network to make a virtual version of yourself?
Writing, speaking, drawing, tweets, vlogs and all other forms of "content" we create are conscious to some extent. Our thoughts, however, are a constant stream, like a fire hose, mixed with both conscious and serendipitous thoughts.
Could this virtual self be more objective and less susceptible to emotions? Or will it also have your biases built into it? Could you have this virtual self answer calls for you? Reply to messages?
Could we use this set of thoughts to spawn other forms of virtual interactions? Perhaps chatbots. Imagine a customer support chatbot modelled after Gordon Ramsay. Hilarious.
2021, among other shortages like toilet paper, boba tea and pickle jars, saw a shortage of lumber. Let's try to understand what led to the shortage and the subsequent high prices.
After the financial crash of 2008, home building took a hit, causing lumber prices to fall. On top of that, a lot of woodland was lost to bark-eating beetles, adding to the downward pressure on prices.
In a similar fashion, prior to the Covid crisis we lost about 6.2m acres of woodland to wildfires in 2017-18. When the pandemic came, investors expected history to repeat itself and went short on lumber (expecting prices to go down). Likewise, wood suppliers cut production in anticipation of low demand.
But what happened instead was that more people wanted to renovate their homes and complete all the projects they had always put off; now they had little else to do. A waning supply combined with surging demand led to a 377% increase in the price of wood this year.
Greenfield and brownfield investments are ways in which a company can expand into other countries. They are two different types of foreign direct investment.
A greenfield investment is when the company builds the necessary resources from scratch in the new country. This could include building new plants, distribution centres etc. The greenfield option makes more sense if the business operations are unique and custom-made for the company: if the operations are near one-of-a-kind, it is often cheaper in the long run to build them from scratch and run them like existing parts of the business. It is the riskier of the two, as it is also the more expensive option upfront. A good example is the Gigafactory Tesla is building in Germany.
A brownfield investment is when the company relies on acquiring existing companies and facilities. This makes sense when the business operations are fairly standard. In some cases facilities are leased instead of purchased; this can be an interim solution before the company takes the leap and builds its own facilities in the new market.
This way of thinking can be applied to personal projects as well. Would it be easier to update your personal website by making tweaks, or to rewrite it from scratch?
Excel is a remarkable piece of software, used by people from every walk of life: developers, data scientists, statisticians, analysts, you name it. You can have wonderful tools that spit out all kinds of data, or scripts that do complex calculations, but at the end of the day you copy the results into an Excel workbook to send them off to someone or to do simple tasks like extrapolation and sorting.
Its influence runs so deep that scientists even had to rename human genes (like MARCH1) because Excel kept auto-converting their names to dates.
Excel is being unbundled. SaaS products like Airtable and Google Sheets are the biggest examples. They connect the typical column-row sheet to the internet, opening up many more possibilities. These services also expose APIs, making it possible for other applications to read, write or manipulate the data.
Excel has had a very peculiar trajectory. Most pieces of software get more bloated as features pile up; the opposite has been true for Excel. Over the years, the features introduced have strengthened and expanded its use-cases even more. That is a very hard thing to achieve in the software world.
A special economic zone (SEZ) is a designated area in a country with special economic regulations to incentivize foreign direct investment. Usually these incentives take the form of reduced taxes, relaxed labour regulations, customs exemptions etc.
SEZs became popular in the mid 20th century. The primary purpose was to attract foreign investment. Now they are also used to attract domestic investment and, eventually, to encourage investment outside of the zone as well.
A second-order outcome of a special economic zone that attracts a lot of companies to the same geographical area is that there are more chances of cross-pollination, in terms of employees, competence and technology. Synergies like having a large part of a supply chain concentrated in a small area are attractive for businesses: they can save costs at both ends of the chain and improve the pace of innovation. A textbook example is the Shenzhen Special Economic Zone. It found the sweet spot for manufacturing electronic goods and soon came to dominate global supply chains.
The economic activity from an SEZ usually spills over outside of it too. SEZs that are big enough can eventually have cities built around them to facilitate the labour market and provide better logistical support.
The Barnum Effect, also known as the Forer Effect, is the tendency to take general statements and interpret them from a personal point of view. A very common place the Barnum Effect is exploited is horoscopes and fortune telling. You commonly see sentences like "When something good lands in our lap, we frequently fail to appreciate it, because we immediately begin the process of protecting, coveting, and ultimately hoarding our treasures." Such a sentence can carry a different meaning for each reader, because as we read it we relate it to our own life experiences.
The more personal input you give in the process, the stronger the effect. Personality tests like the MBTI can seem more accurate than they are, especially when combined with confirmation bias. If a test, for whatever reason, tells you something you already knew or wanted to be true, it is highly likely that you will believe it. People won't believe, or pay attention to, general statements they consider negative.
One use case that will surely be filled by cryptocurrencies like Ethereum in the coming years is that of internet money.
Just as we had an era of web-first applications and shops, which then became mobile-first, internet-first money is yet to be fully realized: money that is not tied to a physical entity per se, but at the same time can store value and be a medium of exchange over the internet.
Cash, PayPal credits and bank credit are all forms of value that individual users and businesses use to transfer wealth between each other. This can be done online too, but the underlying mechanism is still tied to bank accounts and balance ledgers maintained by different banks. The internet only acts as a proxy for the transaction.
Internet money, on the other hand, has the capability to sustain the transaction and its underlying mechanism end-to-end within the internet. Imagine if APIs, rather than users or businesses, wanted to exchange money: internet money would be the default way to do it.
By the looks of it, cryptocurrencies are best poised to fill this use-case.
The work ethic here is breaking down whatever you want to be better at into a chunk that can be performed every day, and then doing it every day. No excuses. No ifs. No what-ifs. Just do it. Every single day.
He famously started daily vlogging to force himself to film something every day.
The secret sauce is keeping the daily task at the boundary of your current ability: hard enough to provide motivation to work at it and get better, but at the same time not too easy or boring.
This won't be easy. Doing something hard deliberately, day after day, requires discipline.
What you see is all there is.
Coined by Daniel Kahneman. Even though we like to think of ourselves as rational and objective human beings when making decisions, the truth is that we are very much affected by our psychological biases and emotions.
WYSIATI is the bias that leads us to make judgements and draw conclusions from the information available to us, without considering what other pieces of information might be missing. We have a tendency to bend the pieces of the information jigsaw puzzle when key parts are missing. Our brain chooses the path of least resistance to form the big picture. It is a machine designed to jump to conclusions.
One way to combat this is to have a structured thinking process and a system to evaluate decisions. Deliberate decision-making can be practiced and it takes time and effort.
Different currencies around the world have different purchasing powers. Purchasing power is defined as the amount of goods or services you can buy for the same amount of money. It varies not only from country to country, but also over time.
The amount of goods you could buy for $10 now is very different from what you could buy in the 1950s.
Different countries have different interest rates, budget deficits, inflation or deflation, imports and exports, and GDP. All of these factors contribute to differences in purchasing power.
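As a back-of-the-envelope illustration of how inflation erodes purchasing power over decades, here is a sketch in Python. The 3.5% average rate and 70-year span are assumptions for illustration, not CPI data:

```python
def equivalent_price(amount, annual_inflation, years):
    """Price today of what `amount` bought `years` ago,
    assuming a constant annual inflation rate (compounding)."""
    return amount * (1 + annual_inflation) ** years

# What $10 from roughly 70 years ago corresponds to at 3.5% average inflation.
print(round(equivalent_price(10, 0.035, 70), 2))  # a bit over $110
```

Even a modest-sounding annual rate compounds into an order-of-magnitude change over a lifetime, which is why purchasing power comparisons across decades need a deflator.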
One common way purchasing power is compared across countries is by calculating purchasing power parity (PPP). PPP is a worldwide measurement that uses the price of specific goods in each country.
To avoid large errors in this calculation, a basket of goods is chosen. The categories include food and beverages, housing, transportation, medical care, education, apparel, recreation etc.
The Economist introduced the Big Mac Index in 1986. It became an informal way to measure the difference in purchasing power between countries. The Big Mac Index compares the price of a Big Mac in each country to a reference price to give an idea of the currency's purchasing power.
However, the index can only be calculated in countries that have a McDonald's. Another limitation is that the Big Mac Index can end up measuring the local willingness to pay for a fast-food burger.
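The index's arithmetic is simple enough to sketch: divide the local Big Mac price by the US price to get an implied PPP exchange rate, then compare it with the market rate. The prices and exchange rate below are made-up numbers, not actual Economist data:

```python
def implied_ppp(price_local, price_us):
    """Exchange rate (local units per USD) that would equalize Big Mac prices."""
    return price_local / price_us

def over_undervaluation(price_local, price_us, market_rate):
    """Positive => local currency looks overvalued vs the USD;
    negative => undervalued."""
    return implied_ppp(price_local, price_us) / market_rate - 1

# Hypothetical: a Big Mac costs 24.0 local units, 5.0 USD in the US,
# and the market exchange rate is 6.0 local units per USD.
print(f"{over_undervaluation(24.0, 5.0, 6.0):+.0%}")  # -20%, i.e. undervalued
```

The implied PPP here is 4.8 local units per dollar versus a market rate of 6.0, so by this (very rough) yardstick the local currency trades 20% below its burger-parity value.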
Hindsight bias is the tendency to look back at past events and perceive them as more predictable than they actually were. For example, there are many ways a global pandemic could have played out, and there was a lot of uncertainty at the beginning. But in retrospect it can seem as if no other course of events was probable.
When looking at your own past decisions, it might seem that you made the right decisions, when you actually made them for the wrong reasons. We don't want to be critical of ourselves, so our brain takes a shortcut and explains the decision in a much easier way using the information about the outcome that you have now.
The only way to keep a check on this is to record how a decision was made, and evaluate it at a future point in time.
A common shortcut our brain takes when comparing two options is to assume, at least for the moment, that everything else is the same between the two options, hoping to get a more objective judgement of the situation.
According to Occam's razor, if all else is equal, the simpler explanation is preferred over a complex one. However, it is rare in the real world to have to decide between two options that differ in only one dimension.
This mental model can backfire if not used with care in the real world. It is important to understand which other parameters are being assumed constant, and whether a real-world scenario could even exist in which the options are similar in every dimension but one. The bottom line is that the comparison should be made between two possibilities that have a chance of existing in the world. It doesn't help to compare two hypothetical situations and then use that judgement to choose a real-world option.
Figuring out which parameters affect the system directly and which indirectly is a good exercise. Comparing indirect parameters is a recipe for disaster. By direct parameters, I mean parameters that are fundamental to the characteristics of the system: first-order implications rather than derived aspects. For a business, it could be profit after tax; for a business loan, the interest rate and time period.
That said, methods built on this principle are a good way to get a better understanding of how different variables affect the outcome and behaviour of the system in question.
A term I came across in Mental Models Vol. 1, but something I have found myself doing, and it has been useful in helping me make decisions.
When presented with a decision that is (a) affected by a lot of the uncertainty the future holds and (b) has a large set of guessable outcomes, a theoretical world generator helps.
Having a checklist of aspects to consider is a good idea: a set of checklists covering different dimensions. They could be classified as internal or intrinsic aspects versus external ones, or classified in terms of time.
Thinking about the possibilities in the form of a tree can also help.
After mapping out most of the outcomes and possibilities, the next step is to think through the probability of each outcome happening.
Now here is the kicker. What makes this really neat is that we start from the different outcomes and work our way backwards to the event and decision: thinking forward and backward through a possibility tree.
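One way to make the possibility tree concrete is to attach a probability and a payoff to each branch and roll expected values back up to the decision. The tree structure and the numbers below are illustrative assumptions, not a prescribed TWG format:

```python
def expected_value(node):
    """Roll up a possibility tree.

    A node is either a numeric payoff (a leaf) or a list of
    (probability, child) pairs whose probabilities sum to 1.
    """
    if isinstance(node, (int, float)):
        return node
    return sum(p * expected_value(child) for p, child in node)

# Hypothetical decision branch: launching a side project.
launch = [
    (0.2, 100),                      # takes off
    (0.5, [(0.6, 10), (0.4, -5)]),   # mixed result, itself uncertain
    (0.3, -20),                      # flops
]
print(expected_value(launch))
```

Walking the tree forward enumerates worlds; the recursion walks backward from the leaves to a single number at the decision node, which is exactly the two-way traversal described above.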
This exercise can easily backfire too, so figuring out which problems this approach works on and which it doesn't is key. That takes a lot of trial and error, growing domain knowledge and, in general, being able to ingest a large amount of information before running the TWG.
The TWG is also a good way to evaluate your past decisions. Given the information you had at the time, if you ran a TWG now for that decision, how would your decision change?
"I rather be roughly right than be precisely wrong". Avoid the craving to go into details of outcomes of trigger that should happen. Keep it as broad as possible. When it comes to figures and statistics, look at ranges and trends not at actual figures down to decimal
When digital interfaces first came into existence, designs were quite basic: straight lines, simple shapes and text. But soon after, with higher-resolution displays and faster GPUs, the possibilities became endless.
The earliest digital GUI designs resembled real-world objects. Skeuomorphic design aims to emulate the aesthetics of physical objects. This type of design was very common in Apple's early work, and eventually gave way to a more digital-first style, also known as flat design.
Skeuomorphic design remains common in digital plugins in the audio production domain, where knobs and meters resemble the actual physical instruments.
The Problem of Points is a classical problem in probability.
The setting of the problem is a game played between two players. Each round has an equal likelihood of either player winning. It is agreed upon before the game that the first player to win a certain number of rounds wins. The pot, contributed equally by both players, is given in its entirety to the winner.
Now, the question is: if the game were interrupted for some reason, how could the pot be divided fairly?
Pascal and Fermat worked on this problem and came up with a solution, the tenets of which form the basis of expected value in probability.
To keep it simple, the solution involved calculating the odds of each player winning the subsequent rounds that were to be played, and splitting the pot based on that. While Fermat provided a logical way to compute the final split of the pot, Pascal came up with a method that could easily compute the split for any number of remaining rounds.
This involved drawing up a decision tree to map out all possibilities and their odds in an efficient way, and gave birth to what is now known as Pascal's Triangle.
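The fair split can be sketched as a small recursion over the remaining rounds, with each round a fair coin flip. This is a modern restatement of the Fermat-Pascal reasoning, not their original notation:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def p_first_wins(a, b):
    """Probability the first player wins the match, needing `a` more
    round wins while the opponent needs `b` more, each round being 50/50."""
    if a == 0:
        return 1.0   # first player has already won
    if b == 0:
        return 0.0   # opponent has already won
    return 0.5 * p_first_wins(a - 1, b) + 0.5 * p_first_wins(a, b - 1)

# Interrupted when the first player needs 1 more win and the second needs 2:
# the classic answer is a 3:1 split of the pot in the first player's favour.
print(p_first_wins(1, 2))  # 0.75
```

Each player's share of the pot is simply their probability of winning the remaining rounds, which is the expected-value idea the problem gave rise to.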
Every successful business has done at least a few things right, if not many. It has to find a product or service it is good at, possess some characteristic that differentiates it from competitors, find a market to sell to, and be profitable at the same time.
Businesses are in themselves a beast of an animal.
Then there is a special breed of businesses that can not only do all of the above, but spawn off other businesses at the same time. By spawning off, I don't mean extending an existing product line or expanding into nearby verticals, but building entities outside the primary field of expertise and growing them into independent businesses.
Amazon is a good example. Starting from selling books online and expanding to other verticals is quite logical. But spawning off businesses like Amazon Web Services, Amazon Prime and the Kindle is not. And that did not come for free.
Two major factors enabled them to do this: access to capital, and ideas for businesses with the potential to return a high ROI. Internet-based businesses have a very high ROI, given the low investment required.
Having access to capital and serially trying one idea after another is a deliberate effort for sure. And the more tries you get, the higher the chances of success.
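The "more tries, higher chances" claim has a simple quantitative form: with independent attempts that each succeed with probability p, the chance of at least one success after n tries is 1 - (1 - p)^n. A quick sketch, with p = 0.1 as an arbitrary illustration:

```python
def chance_of_success(p, n):
    """Probability of at least one success in n independent tries,
    each succeeding with probability p."""
    return 1 - (1 - p) ** n

# Even a 10%-per-try idea pipeline compounds quickly with persistence.
for n in (1, 5, 10, 20):
    print(n, round(chance_of_success(0.1, n), 2))
```

At 20 tries the cumulative chance is already close to 90%, which is why pairing capital with a steady stream of attempts is such a deliberate strategy.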
A note on decision making. A decision is good or bad not based on its outcome, but based on what information was available at the time it was taken and what heuristics were applied to reach it. Tweets like "If you had invested $10 in X 10 years ago you would be a millionaire now" are quite common, and that is evaluating a decision based on its outcome. What you should be asking instead is: what information did you have then, what was missing, what could have been predicted given the information you had, what mental models and principles were at play, and why did things unfold as they did? This kind of thinking is crucial to improving how you think. For most people, dwelling on past decisions is more tiresome than fruitful; applying this without it messing with your head requires a good amount of self-awareness and discipline.
Comparative advantage describes how trading parties will choose to produce more of a good in which they have a comparative advantage, and use it to trade for other goods in which they don't. Comparative advantage forms a foundation of international trade.
Comparative advantage is why there are call centers in India, manufacturing in China and highly specialized, capital-intensive labor in the US. It makes sense for China to manufacture electronic goods and trade them with countries that specialize in producing other goods.
Similarly, an oil-producing country has cheaper access to raw materials for chemical products. It has a lower opportunity cost of producing chemical products than countries without access to cheap raw materials, which drives more chemical production in oil-producing countries.
However, there are cases where countries, governments and businesses lobby together to protect niche interests. This can effectively keep cheaper foreign goods out of the market to protect domestic businesses. In the long term, though, this may not be a good solution, as neighbouring countries that trade freely will be better off, spending less to get the same goods.
Parimutuel stands for mutual betting. All bets of a particular type are placed in a pool, from which the house take is deducted. This kind of betting differs from fixed-odds betting in that the payout is not known at the time of placing the bet.
The payout is determined after the event has taken place, and the pool is split among those who bet on the winning outcome. This kind of betting is common in horse racing.
An important innovation in this betting system was the introduction of the totalizator; nowadays these are computers. A totalizator reads the current pool and betting spread, calculates the odds and payouts, and shows them to the public. This encouraged even more betting, as it gave bettors live feedback.
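The core of what a totalizator computes can be sketched in a few lines: deduct the house take, then divide the remaining pool by the amount staked on each outcome. The pool amounts and the 15% takeout below are assumptions for illustration:

```python
def parimutuel_payouts(bets_per_outcome, takeout=0.15):
    """Payout per unit staked on each outcome, should that outcome win.

    bets_per_outcome: dict mapping an outcome to the total amount bet on it.
    takeout: the house's cut of the gross pool.
    """
    pool = sum(bets_per_outcome.values()) * (1 - takeout)
    return {outcome: pool / staked
            for outcome, staked in bets_per_outcome.items()}

# Hypothetical race: 1000 units bet in total, 850 left after the takeout.
bets = {"Horse A": 500.0, "Horse B": 300.0, "Horse C": 200.0}
print(parimutuel_payouts(bets))
```

Note how the payouts move as the betting spread changes: the favourite (Horse A) pays 1.7 per unit while the longshot (Horse C) pays 4.25, which is exactly the live feedback that fuelled more betting.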
Pricing power allows a business to price its products or services above the rate of inflation, and above its competitors, without losing sales. Pricing power can come from a few sources: a very high quality product; a patented efficiency or convenience feature that competitors cannot match; or a business operating as a monopoly where the barriers to entry are high.
A key indicator of the strength of a business is its pricing power. That said, a business without strong pricing power is not necessarily a bad business. The opposite, however, has a much higher probability of holding true.
In some cases, untapped pricing power is a good indicator that a business is mispriced by the market. To reach such a conclusion, one must have an idea of the costs and margins of that particular business against the backdrop of its competitors.
High pricing power can materialize as high ROIC over the years. Keep in mind, though, that high ROIC can also result from many other factors. Operating and gross margins likewise provide an indication of the pricing power a company has.
The Gini coefficient is a statistical measure of dispersion. It is an indication of the spread of data, similar to standard deviation and variance. In economics, the Gini coefficient is used to represent the income inequality within a demographic.
This coefficient was developed by the Italian statistician Corrado Gini. The Gini Coefficient can be calculated for both continuous distributions and discrete data.
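For discrete data, one standard formulation expresses the Gini coefficient through the mean absolute difference between all pairs of values: G = Σ|xᵢ − xⱼ| / (2n²·mean). A small sketch over made-up incomes:

```python
def gini(values):
    """Gini coefficient of a list of non-negative values,
    via the pairwise mean absolute difference."""
    n = len(values)
    mean = sum(values) / n
    mad = sum(abs(x - y) for x in values for y in values)
    return mad / (2 * n * n * mean)

print(gini([1, 1, 1, 1]))            # 0.0: perfect equality
print(round(gini([0, 0, 0, 10]), 3)) # 0.75: one person holds everything
```

0 means everyone has the same income and values closer to 1 mean the income is concentrated in a few hands, which is why the coefficient works as an inequality summary for a whole population.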
Over the last century, developed countries have seen a trend of increasing inequality, while developing countries have seen a decline in inequality since the 1980s. This can be attributed to the new jobs created by technology and globalization. Countries with strong redistributive policies tend to have a lower overall Gini coefficient.
The Gini coefficient is used in other disciplines too. In ecology, it is used as a measure of biodiversity in a region; in health, as a measure of inequality in health-related quality of life within a population.
The Flynn Effect is the increase in intelligence test scores over the 20th century. IQ scores are normalized to a mean of 100 with a fixed standard deviation on either side. As the mean is re-normalized against the sample population over the years, it has been observed that raw scores shift upwards by roughly three points per decade.
There are a few explanations for this trend.

Total schooling time has increased steadily over the years, which means kids have had more chances to practice their analytical and cognitive skills.

Tests similar to IQ tests are used in a lot of scenarios, so familiarity with such tests has grown.

Kids now grow up in a more stimulating environment than before, with more video games and television stimuli. There are studies showing better cognitive activity and hand-eye coordination among those who play more games.

Nutrition, in both quantity and quality, has improved over time. There is data suggesting that the brain is ever so slowly growing as well under these improved conditions.

Health conditions are better. Developed nations in general are more equipped to handle infectious diseases, and vaccinations are part of everyone's upbringing.
Light sails, a.k.a. solar sails, are a method of propelling a spacecraft using light. Light from the Sun exerts radiation pressure on the sails, which accelerates the spacecraft. However, the magnitude of the force is quite low: a sail of dimensions 800m x 800m experiences a force of only about 5 newtons at Earth's distance from the Sun.
What happens is that a fraction of the photons hitting the light sail are reflected and some are absorbed, which transfers momentum from the photons to the sail. For the momentum to add up to anything considerable, there have to be a lot of photons, and the sail has to be as light as it can be.
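The ~5 N figure can be checked with a back-of-the-envelope calculation: radiation-pressure force is roughly (1 + reflectivity) x irradiance x area / c, assuming the sail faces the Sun squarely and reflects perfectly:

```python
C = 299_792_458      # speed of light, m/s
IRRADIANCE = 1361    # solar constant at 1 AU, W/m^2

def sail_force(side_m, reflectivity=1.0):
    """Radiation-pressure force (N) on a flat square sail facing the Sun,
    at Earth's distance. reflectivity=1.0 models a perfect mirror."""
    area = side_m ** 2
    return (1 + reflectivity) * IRRADIANCE * area / C

print(round(sail_force(800), 1))  # roughly 5.8 N on an 800m x 800m sail
```

Fully absorbing photons would transfer half as much momentum (drop the reflectivity to 0), which is why real sails are made as mirror-like as possible.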
ʻOumuamua is the first interstellar object detected passing through our solar system; interstellar meaning that it originated outside our solar system. These objects are hard to detect early enough for us to be ready to send a spacecraft after them. In addition, their speeds are too high for conventional propulsion methods to keep up with. Light sails could be a good candidate for probes that patrol the boundaries of our system for such objects in the future.
A layman's idea that has permeated the investing world is that if you use a product or service and enjoy it, then you should own the stock of the company behind it. This is a good example of why Albert Einstein said, "Everything should be made as simple as possible, but no simpler." By making a concept simpler than it should be, this idea omits some important aspects.
There are a few shortcomings to this idea. The fact that you like a product does not mean everyone else does, nor is it any indication of what the business might expand into later. It gives no idea of who its competitors are; maybe the competitors simply don't sell or market where you live.
The business behind the product can tell a totally different story. It can be debt-ridden, caught in malpractice, or fighting a patent lawsuit. Anything is possible.
Household names are a good starting point. Something in you made you purchase the product or service, and that in itself is a prerequisite for a good product. But the analysis shouldn't end there. Looking into the financials and the story of a business is as key to investing as the mouthfeel of the latest flavour of Coke.
All products aiming for a sizable market have to be generalized to some extent. To make a product efficient to create, distribute and market, it is essential that it appeals to the masses; minimizing variability is key to optimizing the business as a whole. Take Apple: back in the 2000s they were successful partly because of their relatively small product line. Fewer options for customers to choose from, but why would they need options if the product was just that good?
On the other end of the spectrum are products bespoke to an individual. A good example is tailored outfits: by definition, the product needs to be unique and match a person's style. Usually these kinds of businesses rely on the physical, more tangible parameters of a customer to make a customised product. With the reduced cost of DNA testing, a whole new vertical opens up the possibility for businesses to create hyper-customized products.
DNA ancestry services help you map out your ancestral lineage using a sample of your DNA. As more and more customers use such a service, the better its data and reach become.
GenoPalate is a business that aims to create a specialized diet plan based on your DNA. Based on the combination of genes a person has, GenoPalate can come up with an optimal nutritional plan. It can provide analysis of how our body digests various types of food and substances (caffeine, alcohol etc.).
A forgetting curve is a representation of memory retention over time. Early studies by Ebbinghaus showed that the forgetting curve is roughly logarithmic, assuming no attempt is made to retain the memory. But how do you retain a memory?
Memories become more consolidated when we try to recollect them. That is why being tested on new concepts helps you remember them better. However, testing yourself right after learning something doesn't help much. This is where spaced repetition comes into play: the more spaced out the repetitions, the more you retain, with less effort on every subsequent try.
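The growth of review intervals can be sketched in code. This is a toy Python scheduler in the spirit of spaced-repetition systems like SM-2; the starting interval and ease factor are illustrative, not taken from any particular study.

```python
from dataclasses import dataclass

@dataclass
class Card:
    interval_days: float = 1.0  # days until the next review
    ease: float = 2.5           # multiplier applied after each successful recall

def review(card: Card, recalled: bool) -> Card:
    """Update the review schedule after a recall attempt."""
    if recalled:
        # Each successful, spaced-out recall earns a longer gap before the next one.
        card.interval_days *= card.ease
    else:
        # A lapse resets the schedule; relearning starts over.
        card.interval_days = 1.0
    return card

card = Card()
for _ in range(4):
    review(card, recalled=True)
print(round(card.interval_days, 1))  # intervals grow geometrically: 1 -> 2.5 -> 6.25 -> ...
```

After four successful reviews the next gap is over a month, which is the "reducing effort every subsequent try" in practice.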
When we encounter a new experience, we recollect an already existing memory, and two things can happen. If the new experience agrees with the existing memory, our brain consolidates it even more. If it differs, our brain creates a new memory based on the original one. A representation is shown in the picture below.
The Lindy effect is a phenomenon where the future life expectancy of an item increases with each year survived. Counter-intuitive as it may seem, there are a few explanations for this effect. It mainly applies to non-perishable items: items that do not have an expiration date governed by unstoppable forces. Time plays an important role here. The longer an item is in play, the more fine-tuned it becomes, and the better equipped it is to survive the coming times.
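A small simulation makes the effect concrete. If lifetimes are heavy-tailed, a common assumption for non-perishable items (a Pareto distribution is used here purely as an illustration), the expected remaining life actually grows with the age already survived:

```python
import random

random.seed(0)

# Pareto-distributed lifetimes: heavy-tailed, no built-in expiry date.
alpha, xm = 3.0, 1.0
lifetimes = [xm / random.random() ** (1 / alpha) for _ in range(200_000)]

def expected_remaining(age: float) -> float:
    """Average remaining life among items that already survived to `age`."""
    survivors = [t - age for t in lifetimes if t > age]
    return sum(survivors) / len(survivors)

# The older the survivor, the longer it is expected to keep going.
for age in (1, 2, 4):
    print(age, round(expected_remaining(age), 2))
```

For perishable items (say, human lifespans) the same calculation would go the other way, which is why the effect only applies to things without a hard expiry.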
Books are a good example of where this effect applies. If a book has been reprinted for the last 40 years, there is a good chance it will be for the next 40, the explanation being that its content appeals to a demographic that is renewed over time, even though the medium might change from printed books to Kindle to audiobooks.
Items or systems that have built a self-reinforcing loop around them tend to follow the Lindy effect. Businesses with moats are an example. They have a competitive advantage that compounds over time, and such businesses become less likely to fail as time passes.
Choosing what to read can be guided by the Lindy effect. Start with books that are foundational in the area; not coincidentally, those books have been around for some time. The same applies when doing a literature review for research. Start with survey papers. They, by definition, cannot go out of trend. Hence their popularity.
Typically factories follow a serial architecture. At the beginning of the chain is the raw material, which is then passed on to a series of stations that each perform a specific task on it. This enables large volumes to be produced and opens up possibilities to optimize material flow and the tasks performed at each station. One drawback is that it is difficult to manufacture different variants, or even different products, on the same production line. Serialized manufacturing units are common in a lot of industries.
Microfactories are a relatively new paradigm for manufacturing units. The factory is split into cells, each equipped with a set of robots and/or human operators performing a series of tasks. Think of a cell as a few stations from the conventional paradigm rolled into one. The robots can be programmed to build different products and carry various end-effectors for performing a set of tasks instead of just one. As a result, each cell can be programmed to build a whole line of products instead of just one.
, a UK-based electric vehicle startup, is pioneering microfactories to build its vehicles.
A digital twin is a collection of digital data representing a physical object. The concept was born within engineering disciplines: a digital twin is created to optimize and test designs even before they are manufactured. The relation between the digital twin and the physical one doesn't end there. Once manufactured, data and measurements are fed back to make the digital model more accurate.
In some cases, digital twins can be used to predict the life, wear and fatigue of an object before they occur in the physical object.
Within the autonomous driving domain, the digital twin extends beyond a single object: a whole digital world is created to mimic the physics and randomness needed to train and improve models that drive autonomously.
As the cost of computation and the necessary hardware comes down, there is a possibility that the digital twin of an object could be embedded within the object itself.
Complexity theory and related concepts emerged quite recently, towards the late 20th century. Complexity means diversity: multiple concurrent interdependencies between different elements of a system. A human cell is in itself a very complex system. Yet cells can self-organize, form groups, create variations and give rise to even more complex systems, like a human.
Complexity theory studies complex systems: systems composed of many components that interact and depend on each other in different ways. Key concepts include systems, complexity, networks, nonlinearity, emergence, self-organization and adaptation.
A business is also a highly complex system, comprising complex individuals both in and around the business who interact with it simply by creating products and selling or buying them.
IPFS is a protocol that can be used to store and share data in a peer-to-peer system, similar to HTTP, the protocol used on the internet to share and store information. One problem with HTTP is that the computer requesting a webpage has to establish a direct connection to the server storing it. Requests are therefore serial and tied to a single source.
IPFS, on the other hand, can download a file from multiple sources on the network. IPFS creates a cryptographic hash for every block within a file. These blocks can be stored in multiple, different locations. IPFS uses a decentralized name server, IPNS (the equivalent of DNS for HTTP), to map human-readable filenames to these cryptographic hashes. This allows the requesting node to download several pieces of the file simultaneously from different locations.
IPFS essentially decouples content from addressing on a network. A requesting node can download a file from the nearest possible location instead of being bottlenecked to one.
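A toy sketch of the content addressing underneath this: blocks are identified by a hash of their bytes, so any peer can serve a block and the requester can verify it. Real IPFS uses multihash-based CIDs and much larger chunks; the chunk size and hash choice here are illustrative.

```python
import hashlib

def chunk(data: bytes, size: int = 256) -> list:
    """Split content into fixed-size blocks."""
    return [data[i:i + size] for i in range(0, len(data), size)]

def address(block: bytes) -> str:
    """Content address: derived from the data itself, not from its location."""
    return hashlib.sha256(block).hexdigest()

data = b"hello ipfs " * 100
blocks = {address(b): b for b in chunk(data)}

# Any peer holding a block can serve it; the hash lets the requester
# verify each block regardless of where it came from.
for h, b in blocks.items():
    assert hashlib.sha256(b).hexdigest() == h
```

Because the address is the hash, identical blocks dedupe automatically and a corrupted block from a bad peer is rejected on arrival.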
Berkson's Paradox is a type of sampling bias. It can lead studies to conclude that two events are related when they actually are not. It was first identified in case-control studies.
One study portrays this paradox quite well. It looked at the presence or absence of respiratory disease and locomotor disease in two samples: one from a random community in the general population and one from a hospital. The hospital sample suggested that you are much more likely to have a locomotor disease if you have a respiratory disease. But it is not true. The correlation emerges because patients admitted to hospital are more likely to have both diseases, while the part of the population that has neither a respiratory nor a locomotor disease is not considered at all.
The community sample made it clear that there is no correlation between the two diseases.
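The paradox is easy to reproduce in a simulation. Here the two diseases are generated independently, and admission depends on having either one; the prevalences and admission probabilities are made up for illustration:

```python
import random

random.seed(1)

population = []
for _ in range(100_000):
    resp = random.random() < 0.1   # respiratory disease, independent of...
    loco = random.random() < 0.1   # ...locomotor disease
    # Either disease can independently get you admitted to hospital.
    admitted = (resp and random.random() < 0.5) or (loco and random.random() < 0.5)
    population.append((resp, loco, admitted))

def p_loco_given_resp(sample):
    """Estimate P(locomotor | respiratory) within a sample."""
    with_resp = [loco for resp, loco, _ in sample if resp]
    return sum(with_resp) / len(with_resp)

community = population
hospital = [p for p in population if p[2]]  # condition on admission

print(round(p_loco_given_resp(community), 3))  # close to the true 0.10
print(round(p_loco_given_resp(hospital), 3))   # inflated: a spurious association
```

The diseases are independent by construction, yet conditioning on admission makes locomotor disease look more common among respiratory patients.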
Control-group-based studies are common in many fields, not just healthcare, especially in studies evaluating industry and consumer trends. It is important to look at where the data is coming from, i.e. who is in the sample population and how that relates to the big picture.
Blockchains are now widely used to perform transactions between two parties, where the blockchain ensures the validity of the transaction and records it in the distributed ledger. However, for a transaction to complete, all the individual peer copies of the ledger have to be updated, and this can be very time- and resource-intensive.
The Lightning Network is a protocol that works on top of a blockchain and allows transactions to happen off the chain. The protocol lets a payment channel be opened between two parties, validated by the blockchain. Once a payment channel is set up, all transactions between the two parties can happen near-instantaneously. The record of each transaction only needs to be updated in a certificate tied to the payment channel. Once the payment channel is closed, the result is recorded in the main blockchain.
For example, a customer can have a payment channel open with a coffee shop and use it to pay for coffee every day. Once the payment channel is closed, say once a year, the records are ratified and added to the main blockchain.
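The coffee-shop example can be sketched as a toy model. This ignores signatures, disputes and everything else a real payment channel needs; it only shows the bookkeeping idea of many off-chain payments collapsing into two on-chain records.

```python
class PaymentChannel:
    """Toy model of an off-chain payment channel between two parties."""
    def __init__(self, chain: list, customer: int, shop: int):
        self.chain = chain
        self.balances = {"customer": customer, "shop": shop}
        self.chain.append(("open", dict(self.balances)))  # funding tx on-chain

    def pay(self, amount: int):
        # Off-chain: just update the mutually agreed balance certificate.
        self.balances["customer"] -= amount
        self.balances["shop"] += amount

    def close(self):
        # Only the final state is settled on the main chain.
        self.chain.append(("close", dict(self.balances)))

chain = []
ch = PaymentChannel(chain, customer=100, shop=0)
for _ in range(30):           # a month of daily coffees
    ch.pay(3)
ch.close()

print(len(chain))             # 2 on-chain records for 30 payments
print(ch.balances)
```

Thirty payments, but the ledger that every peer has to replicate only ever sees the open and the close.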
What the internet together with the smartphone has done is democratize accessibility. Any website, article, tweet, video, image, product or service on the web can be reached by anyone in the world. But this has taken the geographical localization out of the experience of buying something or experiencing a product.
In most cases it is more efficient to order food in an app than to physically go to a restaurant and pick up an order. The same applies to e-commerce, flight and hotel booking etc. Businesses that rely on the hyperlocal aspect still need a network of people to run their operations, whether it is the delivery and logistics arm of Amazon or distributed pockets of people ready to rent out a space to strangers.
The scalability of such systems on top of a stack like the internet is huge. Nothing new here. But what that means is that, for a given reward structure, a biggest fish WILL emerge in almost every pond, not completely extinguishing its competitors, but leaving enough room for bespoke peers in the market.
An example is Amazon, followed by all the niche e-commerce websites. Or Google, followed by the rest.
Our skin has many types of neurons that allow us to feel touch. There are receptors of different kinds, triggered by different stimuli but activating the same way once triggered. Thermoreceptors fire on changes in temperature, nociceptors on pain, mechanoreceptors on mechanical stress. These receptors send signals to the spinal cord and the brain to register a touch.
A combination of these receptors spread across an area fires in different patterns depending on the stimulus. This is how we are able to distinguish textures and types of touch.
In general, robots are really good at predictable motions that can be broken down to a set of axes and have a known distribution of forces. This is why a robot is used for welding different parts of a vehicle body, but a human operator is used to install an intricate wiring harness.
Tactile feedback would be a large improvement in the feedback loop for robots if they are to have even a fighting chance at learning more complex tasks. A new technology developed at the University of Hong Kong allows robots to detect tactile inputs at super-resolution.
The system uses a flexible magnetized film as the skin and a printed circuit board as the structure. The film creates a magnetic field within the device, and subtle changes in that field are sensed to determine the touch.
Market capitalization, aka market cap, is the total market value of a business. It is calculated by multiplying the current number of shares outstanding by the price of one share. For example, a company with 1000 shares worth $10 each would have a market cap of $10,000.
Companies are usually classified based on their market cap as small-cap, large-cap etc. Microcap refers to companies with much smaller capitalizations; in the U.S. it means a market cap of roughly $50M to $300M.
Microcap companies are not that popular, and not that well covered by mainstream media. Hence the volumes traded on these stocks are quite low. This can be risky, as it might be hard to offload or buy into large positions.
Since these companies are not well covered, the data on them is generally cleaner. Less noise overall. And these companies are usually easier to analyse, as they have simpler businesses and product lines compared to larger companies.
A supply chain is the system of activities required to convert raw materials into the final product, spread across time and geography. Traceability in such systems makes it easier to understand what is happening at each step of the process. This is critical especially in the food and pharmaceutical industries, which are trying hard to trace each component that goes into their products.
With better traceability, it is easier to enforce regulations. Regulatory authorities can more easily make sure that no banned substances or processes were used in any product.
From a business point of view, it becomes easier to optimize value chains for each product and find possible synergies. All businesses do this today, but at a higher level; not every company cares about tracing raw materials from the ore to the form they use. Traceability would make the history of each delivery, and the complete chain, transparent.
Within the food industry, there is a growing need and market for products that are ethically sourced. Big companies want to be seen on the right side of history when it comes to sourcing from local farmers and institutions. Traceability in supply chains is one way to improve their overall brand image. The Fairtrade initiative is an example, used by downstream companies like Ben & Jerry's, who source cocoa, sugar etc. under it.
There are a few ways traceability can be introduced to a supply chain. Some companies make their products traceable using barcodes; RFID tags and alphanumeric codes are possible solutions as well. Blockchain is a relatively new technology being used for this purpose: a distributed ledger that cannot be tampered with keeps track of the supply chain, and all stakeholders are free to check its validity themselves. A drawback is that these chains become stronger and more viable when the amount of information is huge. For smaller amounts of data, a centralized approach may be more practical.
Supply chains usually involve contracts between different parties, right from the start until the product reaches the customer. Blockchain may be a more viable solution for managing these smart contracts in one of its ledgers. The agreements are usually repetitive in nature and can span different geographies, time-zones, currencies etc. A distributed ledger could act as a common protocol used by all stakeholders to manage and maintain contracts.
Unit economics is a neat way to look at the fundamental economics of a business. A sanity check. It starts with defining a "unit": a product sold or a customer acquired, depending on the kind of business and business model. Unit economics then describes how much value is created from this "unit".
Unit economics helps you understand how much has to be spent to perform the one transaction that is core to the business. When analysing a business, it provides a quick way to check feasibility. It would be quite clear if an insane amount of money has to be put into acquiring one customer, or if logistics charges are eating up too much of the margins.
The unit economics and its various components can be projected out into the future. This gives you a fair idea of how the business would have to perform to stay solvent; whether that is realistic is another question. Comparing figures on a per-unit basis also eliminates the risk of comparing different things altogether. We do not want to compare apples to oranges.
Unit economics for a restaurant would look like this. At a basic level, a restaurant cooks food for its customers at a marked-up price to earn a profit on the product (the meal) and service. So the margin on a restaurant meal is the difference between the price of the meal and the cost it took the restaurant to make it. The cost can then be broken down into the hourly wages of the workers, the cost of running the kitchen, the cost of the ingredients etc. This gives us an idea of how much has to be spent, and what each dish has to be priced at, to obtain a margin that justifies running a restaurant business.
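The breakdown above as a back-of-the-envelope calculation. All figures are made up for illustration, with one meal as the "unit":

```python
# Hypothetical figures for one meal (the "unit").
price = 18.00
ingredients = 4.50
labour = 5.00       # share of hourly wages attributable to one meal
overhead = 3.50     # rent, utilities, kitchen upkeep per meal

cost = ingredients + labour + overhead
margin = price - cost
margin_pct = margin / price

print(f"cost per meal: {cost:.2f}")
print(f"margin per meal: {margin:.2f} ({margin_pct:.0%})")
```

Projecting this out is then just multiplying by expected meals per day; if the per-unit margin is negative, no volume will save the business.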
Ludwig van Beethoven is rumored to have invented this technology. The famous composer, who was also deaf, used to press one end of a rod against the piano while holding the other end in his mouth. This would transmit the vibrations, allowing him to hear. Rumor or not, that is the gist of how bone conduction works.
Sound is perceived by our ears through variations of pressure in air. The eardrum vibrates with these pressure differences, transmitting the vibrations to a set of small bones and on to the fluid-filled cochlea.
Bone conduction uses the bones of the skull to transmit vibrations to the cochlea directly, bypassing the ear canal, eardrum and the small bones of the ear. Bone conduction earphones usually sit on the cheekbones. Isolation and audio quality on these headphones are poor because the ear is fully exposed. However, it gives people with hearing deficiencies and hearing loss another way to perceive sound.
Bone conduction has been used in devices to transmit information to the wearer. A moonshot lab is working on super-hearing technology that would allow the voices of specific individuals to be separated from those of a group.
Dr. Robert B. Cialdini wrote a book, a key part of which is the six principles he postulated as the principles of influence.
An ultra niche is a niche within a niche. A minority within a smaller group. The outliers of the pack. The internet makes it possible to cater to such ultra-niches in a viable manner.
Imagine a piece of software that solves a particular problem for 100 people. Only 100. They would be ready to pay for the product, since it is the only product, or the best product, that solves their exact problem. How much they would pay depends on two things: first, how much of a pain the problem is for them; second, how much value is created for them by solving it. What "potential" is being unlocked, and how big is that "potential"? It can be in the form of freeing up their time or helping them do something more efficiently.
Finding the exact 100 people with this problem is the hard part. Forums, Reddit, Facebook groups and Twitter are all ways to find people within a niche, from which you filter again to find the 100 in your ultra-niche.
Super apps: a term that became mainstream thanks to Andreessen Horowitz. A super app is a closed ecosystem with multiple apps and services that work together seamlessly, offering users a wide range of options within the ecosystem.
WeChat is the classic example of a super app. It started out as a messaging app, then branched out to become a social media app. Eventually, it integrated financial services that let users send money to each other, order services, order in restaurants and use WeChat as a default payment method.
Others took a different path: they began as payment apps and then integrated other features.
Why are they becoming more popular? From a user point of view, super apps can tie different services together and offer experiences that conventional apps just cannot. A single app is perhaps better at keeping the attention of users. Less context switching.
This is also a direct outcome of APIs fueling digital growth. Most companies would like to build a brand and business around a service or product, but some are better off just offering an API and letting other businesses figure out the rest of the value chain. Getting integrated into a super app is like getting featured on the front page of Reddit, but for API companies.
Even though these kinds of apps are most common in Asia, there is growing interest in super apps globally. Google is probably at the right place at the right time to capitalise on such a service. They already run a tight ship with their suite of apps, and packaging them into the coherent feature set of a super app might not be that far away.
Grid cells are special neurons that help us perceive position in a larger context, for example to understand our position in the room we are sitting in. The cells themselves are arranged in a grid-like manner and fire based on our position. An array of such cells encodes location, distance and direction. Grid cells are seen in the neocortex of the brain.
The neocortex is the part of the brain involved in higher-order functions such as cognition, spatial reasoning and language. The classical view is that the neocortex receives sensory inputs and processes them in a set of hierarchical steps, where the sensory information is passed from one region to another. A high-level object is assumed to be grasped once the information has passed through all the regions.
A newer theory proposes that there are more grid cells in the neocortex, arranged as columns and rows. Each column creates its own model of an object, based on slightly different sensory inputs. These models then vote to reach a consensus on what is being sensed. As if there are many tiny brains within our brain, and what we perceive is a weighted average of all their outputs.
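The voting idea can be sketched minimally. Here the "weighted average" is simplified to a plain majority vote among hypothetical column guesses; this is an illustration of the consensus mechanism, not the actual theory's model:

```python
from collections import Counter

def consensus(column_guesses: list) -> str:
    """Each column votes on what it thinks the object is;
    the perceived object is the majority vote."""
    return Counter(column_guesses).most_common(1)[0][0]

# Columns receive slightly different sensory input, so a few disagree,
# yet the population still settles on one percept.
guesses = ["mug"] * 7 + ["bowl"] * 2 + ["can"]
print(consensus(guesses))  # "mug"
```

The appeal of the scheme is robustness: no single column needs to be right, the population only needs to be right on average.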
One of the key ideas for investing well over a long period of time is to handle as much of the thinking as possible using structured methods. Leaving things open, to be decided by chance, is an easy way to get affected by emotions, biases and the chatter in your head. A downside of using systems to invest is that it is very easy to fool yourself into thinking you have found a silver bullet.
"To a man with only a hammer, every problem looks like a nail" - Charlie Munger.
Systematic thought processes can be achieved in many ways. A well-known approach is having a toolbox of mental models. Another popular way is to use algorithms.
An algorithm is a well-defined set of instructions that can be executed to reach a solution. Algorithms make the thinking process more deterministic, and they can be based on different heuristics. Screening is a common practice to filter out companies that satisfy certain conditions, usually conditions on various financial ratios. A filter is itself an algorithm.
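A screen as code, to make the point that a filter is just an algorithm. The companies, ratios and thresholds here are entirely made up and not a recommendation:

```python
# A screen is a sequence of filters over fundamentals.
companies = [
    {"name": "A", "pe": 12, "debt_to_equity": 0.3, "roe": 0.18},
    {"name": "B", "pe": 45, "debt_to_equity": 0.1, "roe": 0.25},
    {"name": "C", "pe": 9,  "debt_to_equity": 1.8, "roe": 0.07},
]

filters = [
    lambda c: c["pe"] < 20,               # not too expensive
    lambda c: c["debt_to_equity"] < 1.0,  # manageable debt
    lambda c: c["roe"] > 0.10,            # earns well on its equity
]

# A company passes the screen only if every filter agrees.
passed = [c["name"] for c in companies if all(f(c) for f in filters)]
print(passed)  # ['A']
```

The deterministic part ends here; deciding what to do with the survivors is where the variable, judgment-based part of the system begins.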
Checklists are probably the simplest version of an algorithm. They are a very "analog", pen-and-paper way of putting a system in place.
In the context of investing, systems like checklists and algorithms work better when they are not defined down to the last decimal point. There needs to be a variable part to the system that can change depending on the context and company in question. This is where investing turns into something of an art. Knowing when to use what thought process to value a company.
In some cases, the system won't have a variable part at all, and the complete process can be boiled down to a well-defined algorithm. Once the system reaches that state of maturity, computers can take over and do a much better job than us. Algorithmic trading and investing pushes the boundaries from two perspectives. First, trades can happen in a whole new time domain: instead of timing by time of day, trades can be orchestrated with a resolution of milliseconds. Secondly, the limits of quantitative analysis can be pushed considerably if we let computers handle the complete process.
Walt Disney used this method to come up with new ideas and subsequently break them down and refine them to end up with something viable.
The process is split into 3 parts.
Walt also incorporated a physical aspect into this method: each stage would be done in a different room. This is a really powerful move. It engages and cues your brain to get into a particular mindset each time you enter the room.
Media in the past were limited by the medium: news by physically printed newspapers, the music industry by CDs, television by cable with a limited number of channels that could be programmed. Distribution in these mediums was inherently limited. The internet and smartphone duo breaks this.
Until then, services were usually bundled. You didn't subscribe to sports news, world news or business news separately; they all came as a bundle. The same goes for television. The unit economics made sense to bundle these, even if not every customer was interested in each product or service. With the internet, however, it became easy to offer the individual products at no additional cost. And now we are seeing the great unbundling, as Ben Thompson wrote in 2017.
There are two problems from the customer's point of view. First, there are just too many subscriptions today. According to one survey, as of mid-2019 the average American subscribes to 3.4 streaming services. Managing subscriptions, payments and logins, and finding the right content to consume, is often a task in itself. Secondly, most customers have a monthly subscription budget, which means they have to choose what to subscribe to.
Recently, there has been a great influx of individual creators trying to carve out a space for themselves. Substack has popularised the idea that anyone with a mailing list can start creating content and put some of it behind a paywall. Creators focus on niches to gain ground initially, but eventually they too diversify and spread out. Which is not a bad thing, but is it enough to justify the monthly paid subscription even then? Probably yes, because by then readers are buying not only the content but also the brand around it.
A possible solution, and where this may be headed, is another wave of bundling. The great bundling of niches: an app store of sorts that provides a wide array of content, from Netflix to Substack newsletters, from news shows to sports. There could even be sections for individual creators, journalists and writers. Customers can then mix and match what they would like to subscribe to.
One subscription to rule them all.
Software has enabled a very low cost for product variability. Unlike hardware, different configurations of a software product don't mean different production lines, bills of materials etc.
From a customer's point of view, this cuts both ways. Giving customers more options makes them feel they are in control: they are actively taking part in the buying process instead of being skeptical of what a salesperson tells them. On the other hand, giving them fewer options reduces friction in the buying experience. One of the first things Steve Jobs did after returning to Apple was to clean out the product line. Even though Apple's current product line-up is scattered all over the place, they have stuck to a small number of product categories.
From a business point of view, this makes it easier to offer customers multiple products at low additional cost. Custom products can be sold in a modular way, with customers picking and choosing to create the final product. The App Store works in a similar way. No two iPhones end up alike: they may have the same hardware, but each person personalizes theirs by installing apps to their taste.
Prospect Theory models how a person might make a decision given the probabilities of outcomes and the possible value of those outcomes. We all have a tendency to assign probabilities to possible events, maybe not consciously, but at least in the form of gut feeling. Probability weighting is the tendency to assign higher probabilities to extreme events. We also behave differently towards potential gain and loss; the reaction is asymmetric, depending on whether you are risk-seeking or risk-averse. It also depends on the internal reference we use: given a certain reference point, an outcome might seem more or less risky. This contradicts expected utility theory, which models the decision making of a rational individual. Our behaviour is based not solely on the value of an outcome but on its perceived value.
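The asymmetric reaction to gains and losses is captured by the Kahneman-Tversky value function. A sketch using their commonly cited parameter estimates (alpha = beta = 0.88, loss aversion lambda = 2.25):

```python
def perceived_value(x: float, alpha: float = 0.88, beta: float = 0.88,
                    loss_aversion: float = 2.25) -> float:
    """Prospect-theory value function: concave for gains, convex and
    steeper for losses, measured relative to a reference point of 0."""
    if x >= 0:
        return x ** alpha
    return -loss_aversion * (-x) ** beta

gain = perceived_value(100)
loss = perceived_value(-100)
print(round(gain, 1), round(loss, 1))
```

A $100 loss hurts roughly 2.25 times as much as a $100 gain feels good, which is exactly the asymmetry the prose describes.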
Another effect in economics that can be explained by Prospect Theory is the disposition effect. Deep down we feel good when we win and bad when we lose. If a stock we have invested in goes down, our decision making gets affected by fear, pride and the impending loss. A "rational" investor would sell the asset to cut the losses, even if at the cost of long-term gain. And we are even more likely to hold on to an asset that has gone up, well, because it has gone up.
ARM is both a company and an instruction set used in CPUs. ARM and x86 (used by Intel) are sets of instructions that a CPU can understand and execute, determined by the architecture the actual silicon is built on; hence "instruction set" is used synonymously with CPU architecture. While Intel sells CPU cores directly to manufacturers, ARM, the company, which might be acquired by Nvidia, licenses the standard for other companies to design chips for their own devices. This enables manufacturers to build custom hardware tailor-made for their application, and allows CPUs to have different cores specifically built for different applications. A common example is a device with a multi-core architecture that has specific cores for running machine learning applications.
Apple's shift to ARM processors for the Mac line-up was in the making for quite some time, maybe not always in the form of chips for laptops. Apple has always been a proponent of building both the software and the hardware for its devices, and now they are going all the way. They have really understood the synergies that brings, with substantial cascading effects on the feature set and user experience of their devices. But hardware is hard. Apple chose which components were strategic to its future products and invested in them. They did this successfully for the iPhone and iPad: these devices run on ARM processors designed by Apple but manufactured by other companies.
A moat around Apple devices is the high switching cost. Once you are in the ecosystem, it becomes increasingly hard to switch to another platform. A main reason they were able to make this work is the continuity their devices offer; they gave "seamless" a new meaning with features like AirDrop. It just works. Moving all their devices to ARM processors could mean saving even more development cost, as they no longer have to develop for different platforms. They can migrate features and functionality from the OS and drivers they have perfected over the years on the iPhone to the Mac line-up.
From a business point of view, owning the chip design means Apple can save a lot. Intel integrated design and manufacturing and charged heavily for its design services. Now Apple can keep those margins and outsource the manufacturing.
UiPath is a Robotic Process Automation (RPA) company with a $35 billion valuation. RPA is the holy grail of automation for heavily repetitive, rule-based systems. It emulates human interactions to execute various tasks, using software robots to automate work that would otherwise have to be done manually by a person. RPA is quite similar to GUI testing tools, where you can watch the software mimic mouse and keyboard inputs and record responses from the system.
There are a few moats built into this. The first is scale: if it works on one system, it can be replicated across all systems in a business, at low marginal cost. The second is learnability: these systems typically use computer vision algorithms to detect user interfaces and determine the next step to perform, and they get better with time. The third is that it helps businesses separate out the tasks that can be made efficient by handing them over to bots, freeing up their employees for the harder, more challenging tasks.
A key advantage of RPA is that it can be applied to almost all kinds of businesses. Expansion is possible in two dimensions: horizontally, from one vertical to another, and vertically, exploring the depths of an industry. The best part is that what it learns in one industry can be reused in another context. Essentially, UiPath is creating a toolbox of automation tasks that it can teach its army of robots and deploy to almost any software business in the world.
In the realm of financial assets, value is a function of the cashflows the asset can produce from now until its end of life, discounted appropriately to today's worth considering factors like risk and growth.
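A minimal sketch of that definition in Python; the cashflows and the 8% discount rate below are made-up illustration numbers, not a recommendation:

```python
def present_value(cashflows, discount_rate):
    """Discount a series of future yearly cashflows to today's worth."""
    return sum(cf / (1 + discount_rate) ** t
               for t, cf in enumerate(cashflows, start=1))

# Hypothetical asset producing 100 per year for 5 years, discounted at 8%
value = present_value([100] * 5, 0.08)
print(round(value, 2))  # prints 399.27
```

Note how the same 500 of nominal cashflow is worth only about 399 today; risk and growth enter through the discount rate and the cashflow forecasts.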
Price, on the other hand, is determined by the equilibrium of supply and demand: more supply than demand and the price goes down; more demand than supply and the price goes up.
Most of the time, these two numbers will not be the same; in other words, most financial assets are mispriced. Price and value are the result of different inputs. Price is driven by market factors like news, the overall perception of the company, and trading activity; in the short term, the price is a reflection of market sentiment. Value is determined by the quality of the underlying asset: how the business makes money and what its prospects are. In the short term, the market behaves like a voting machine; in the long term, it behaves like a weighing machine, reflecting the overall performance of the underlying asset.
In an ideal world, these two systems wouldn't mix, but in reality there are cases where they influence each other. Share buybacks and new share issues affect both price and value. In both cases the supply of shares is altered, which directly affects the price. From a value point of view, either operation says something about the quality of the business: a company buying back its shares can be seen as propping its price up, while a company issuing new shares can reduce its debt and increase its overall cash, making it more valuable.
A bad bank is a bank set up to hold the risky, problematic holdings of another bank, so that the original bank can clear them off its books. The transfer of assets takes place at market price. A bad bank is the simple idea of moving the bad parts of a portfolio into another one so that they don't contaminate the rest, allowing the bank to focus on its core activity of lending. A bad bank can be created even when there are no problematic assets, for instance when a bank wants to shift its core strategy.
This is a good way of reducing risk for depositors, though investors may still lose their money. By separating out the risky assets, it lets investors better gauge the health of the company, usually improving its overall outlook and making it easier to raise money in the future.
From a systems point of view this makes sense: divide and conquer. The manager of the bad bank can focus entirely on the risky assets, which gives more room in investment practice since they are no longer tied to the non-risky part of the portfolio. The time horizon and risk profile are quite different for the different underlying assets.
Over-speculation in property assets and the exchange rate of the Swedish krona led to a major banking crisis in Sweden. By 1992, three major banks were insolvent. McKinsey & Company was brought in to resolve the situation and proposed two bad banks, into which the toxic assets of the remaining banks were moved at market price. In the end the government had to step in, at a cost reportedly around 2% of GDP, but it was a backstop that kept this from turning into a larger catastrophe.
After the 2008 financial crisis, this idea gained more popularity, seen as a method to handle such crises without driving large financial institutions into insolvency. More recently, the Indian union budget proposed setting up a bad bank to clean up the bad assets in the country's financial system.
Circularity is an important concept within economic systems. In an ideal world, a circular economy has minimal waste, and most resources are continuously renewed and reused. This systemic approach gives equal weight to the business and the environment. Digitalization can boost circular economies: it provides the right tools, without much added overhead, to keep track of the different chains in the business.
The fashion industry is a good example of where a circular economy can be very useful. Sustainably producing different kinds of natural fibres and dyes is challenging, and a circular economy promotes the reuse of clothes instead of new production. Renewed clothes can be used for all kinds of purposes and can unlock a new business segment. Fashion brands themselves have begun to focus on durability, making clothes and accessories that last. Durability is becoming a more attractive trait; it is a testament to a product's quality.
Ikea launched its own second-hand store in Sweden as part of a push towards getting more life out of its furniture. Furniture is a bit trickier, but Ikea's modular approach makes it simpler to replace parts of a piece instead of the whole. They are looking into ways furniture can be taken apart and put back together multiple times without damage to the components, which will help improve the overall lifetime of the product.
Circular economies will not lead to an infinite economic machine, like the theoretical perpetual motion machine in physics. There has to be a viable exchange of value at each step of the process.
Systems can be decomposed in many ways, and depending on the criteria the resulting decomposition can look different. If a system is divided in terms of technology, you end up with compartments or layers classified by the underlying technology. A system can also be broken down by structure or function, or by domain. Domain decomposition helps particularly when a group of people and roles has to work around the system: it makes it easier for different groups to focus on different areas, and hiring, research and experiments all become more manageable.
Another heuristic for decomposing a system is to identify its atomic parts: sub-units that cannot be broken down any further and whose behaviour is fundamentally fixed, either by definition or by some kind of universal law. We can then express the system as a combination of these units.
This opens up two lines of thinking. First, how can we put these units together in different ways to reach different outcomes? Second, can the inherent behaviour of the atomic units be changed: has anything changed (a new invention, better technology etc.) that can modify their behaviour? Sometimes it is easier to solve a problem at the atomic level than at the system level.
At the beginning of the 20th century, the industrial revolution was kicking business activity into a new pace. The economy gained momentum from the newfound productivity of machinery, which shifted organizations that had until then been family-run towards enterprises, a trend that spread across the globe. In the second half of the century, air travel became more and more prevalent and globalization began: it became easier to do business across borders, just as it became easier to travel. This strengthened the conglomerates and monoliths of the corporate world, which had access to the resources and capital to expand globally.
In the last 20 years we have seen a similar trend fuelled by the internet. Internet companies were fragmented after the dot-com bubble, but they were in a position to get a sense of the scale of the internet early on, and those companies went on to become internet giants. Newer technology like the blockchain, and trends like higher social media adoption, are kickstarting a new wave of business cycles.
In the coming 20 years, businesses will become more remote and decentralized. They can be owned, managed and operated in a distributed manner that was not possible before; the infrastructure for this now exists. Incentives to build on such platforms are limited but expanding. The true potential of decentralized businesses is still unknown; we are at the stage where early adopters are pushing towards the peak of the hype.
The Dunning-Kruger effect is a bias where people think they are smarter than they actually are. A person with low ability is more likely to overestimate themselves, due to a lack of perception of the task at hand and a lack of previous experience, both of which mis-calibrate their judgement. If a person has tried to shoot a basket 1,000 times, they are better placed to judge their own skill than someone who has tried 10 times.
Neural networks exhibit similar behaviour during learning. When a network is trained on a dataset, it optimizes its own parameters to produce a best guess for the problem at hand. If there is not enough data to learn from, the network will often fail to generalize: it has failed to capture an "average" of the problem, forcing it to misread real-world data.
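The effect of too little data can be illustrated with a much simpler stand-in than a neural network: estimating a distribution's mean from a handful of samples. All numbers below are made-up illustration values:

```python
import random

random.seed(0)

def estimate_error(n_samples, true_mean=5.0, trials=200):
    """Average absolute error when estimating a distribution's mean from
    n_samples draws -- a toy stand-in for 'learning from little data'."""
    total = 0.0
    for _ in range(trials):
        draws = [random.gauss(true_mean, 2.0) for _ in range(n_samples)]
        total += abs(sum(draws) / n_samples - true_mean)
    return total / trials

# The estimate from 10 samples misses the true "average" far more often
# than the estimate from 1000 samples
print(estimate_error(10) > estimate_error(1000))
```

The same intuition carries over: with few examples, the learned "average" of the problem is noisy, so the model misreads fresh real-world data.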
Everyone is susceptible to this.
In the original study, tests on groups of students showed that people who scored low overestimated their performance, while competent people underestimated theirs. This is common when we set out to learn something new: initially, with little knowledge, we believe we know far more than we actually do, but as we progress and see the big picture of how hard or vast the topic is, we begin to doubt ourselves. Psychologists call this the valley of despair.
This slope is similar to the hype cycle. Even at a large scale, especially in the realm of new technology adoption, large groups of people tend to follow the same steps.
AdNauseam is a free browser extension that tries to trick advertising networks by messing with your browsing data. It works locally and doesn't itself send data out to any other service. It comes with three opt-in features: hide ads, click ads and block malware.
What it basically does is randomly click ads on behalf of its users, creating a noisy mix of browsing history that leads advertising trackers astray, a strategy known as obfuscation. This reduces the value of the data aggregated from the user. The second-order effect is to pollute the data these services collect as a whole: even if only 1% of users ran such a tool, it could have profound effects on search results and targeted ads, with platforms effectively showing irrelevant ads and seeing lower conversion rates.
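The dilution effect can be sketched with a toy simulation. The categories and click counts here are made up, and this is not AdNauseam's actual algorithm, just the obfuscation idea:

```python
import random

random.seed(1)

AD_CATEGORIES = ["sports", "finance", "travel", "fashion", "tech"]

def top_category_share(real_clicks, noise_clicks):
    """Fraction of the click history matching the user's real interest,
    after AdNauseam-style random clicks are mixed in."""
    history = ["tech"] * real_clicks  # genuine interest: tech
    history += [random.choice(AD_CATEGORIES) for _ in range(noise_clicks)]
    return history.count("tech") / len(history)

print(top_category_share(50, 0))    # clean profile: 1.0
print(top_category_share(50, 200))  # obfuscated: the signal is diluted
```

With enough random clicks, the tracker's best guess about the user's interest drops from a certainty towards the base rate of random noise.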
In a way, this directly attacks the incentive structure of the whole advertising business. A company runs its ads on platforms like Google, Facebook and Twitter for two reasons: users spend a lot of time there, and the platforms understand what each user is interested in. If software like this erodes that understanding, it will force companies to question the value of such platforms. AdNauseam is the start of a new era of privacy-focused applications.
Green bonds are issued according to directives set by the Climate Bonds Initiative (CBI), an international, investor-focused not-for-profit that has set the certification and assurance standards for green bonds. This helps assure investors that the proceeds are being used for the right projects.
Could this system be built on a decentralized platform? There are two parts to it: the part where money is raised, and the part where the money is used to fund projects that help the environment. A blockchain could open access to a much larger market for raising capital, since anyone could chip in to such an initiative, and it would inherently secure the transactions and keep track of where the money is routed.
A dapp is an app built on decentralized technology. Typical apps are built by organizations or single entities, even when they are open source. Dapps have no owner and are free from censorship: the app itself runs on a decentralized network, making it almost impossible to take down. The blockchain, the most popular of these decentralized systems, is a catalyst for such apps.
The application logic lives on the blockchain and executes the same way irrespective of the environment, which gives dapps a more deterministic way of working. Dapps offer privacy to developers and are resistant to censorship. However, they are a computational nightmare: every node in the chain has to execute the logic, it is hard to maintain such an app, and it is hard to develop a coherent user experience when it is spread across multiple nodes.
There are now app stores of sorts for dapps. Brave has built a web browser that focuses on privacy and information control; instead of relying on the traditional advertising model, it uses consumer attention as its form of "currency", and users earn attention tokens by using the browser. Compound is an automated interest-rate protocol that can be used to develop financial dapps: a platform that lets you lend out your crypto assets and lets borrowers take a loan against collateral, with the blockchain securing the lending activity itself.
A deep learning network contains multiple learning nodes, organized as layers and interconnected within and across layers. Typically, such models are trained by activating all nodes for every training input. Another way is to sparsely activate the network for each input, with the help of a Switch Transformer: only a subset of the nodes is active, and the subset varies depending on the input. Sparsely activated networks have a roughly constant computational cost regardless of the size of the whole network. The key feature of sparse activation is that it enables different parts of the network to specialize in different kinds of inputs and problems, much like the brain, which has different regions responsible for different cognitive functions. However, it also brings new challenges like load balancing, to avoid over-training some parts of the network while under-training others.
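A toy sketch of the top-1 routing idea: each token is sent to exactly one "expert", so per-token compute stays constant however many experts exist. The weights are random stand-ins, not the actual Switch Transformer implementation:

```python
import random

random.seed(0)

class SwitchLayer:
    """Toy sketch of Switch-Transformer-style routing: each input is routed
    to exactly one 'expert', so compute per token stays constant no matter
    how many experts the layer holds."""

    def __init__(self, n_experts, dim):
        # Router weights (random stand-ins for learned parameters)
        self.router = [[random.uniform(-1, 1) for _ in range(dim)]
                       for _ in range(n_experts)]
        # Each "expert" is a trivially different transformation
        self.experts = [lambda x, s=e: [v * (s + 1) for v in x]
                        for e in range(n_experts)]

    def forward(self, x):
        # Score the input against each expert, then run only the winner
        scores = [sum(w * v for w, v in zip(row, x)) for row in self.router]
        best = scores.index(max(scores))  # top-1 routing: one expert active
        return best, self.experts[best](x)

layer = SwitchLayer(n_experts=8, dim=4)
expert_id, out = layer.forward([0.5, -0.2, 0.1, 0.9])
print(expert_id)  # only this one of the 8 experts ran for this input
```

Different inputs score differently against the router, so over time different experts see different kinds of inputs, which is where the specialization (and the load-balancing challenge) comes from.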
Google has trained language models with 1.6 trillion parameters using this technique. The nearest model in this area was GPT-3, at 175 billion parameters. This gives an idea of the leverage these models have.
Content is abundant, thanks to an internet that has created an economy of abundance, and curating content is becoming a prominent niche in the creator economy. The realization is that most people don't have time, and curating is an easy, low-barrier-to-entry way to save time for other people, i.e. your audience. Saving time for others is a form of value you can exchange. You don't even have to be an expert in a field to curate: spending some time on related websites and forums can help you identify what is trending, and stumbling upon "interesting" content becomes easier that way. The tricky part is judging whether a piece meets the minimum value proposition: whether it is engaging and whether it should be curated at all.
Getting it out to the world has never been easier. Plenty of platforms can help you kickstart a newsletter, and some are focused on curation as the main form of newsletter content.
Saving pieces of content and capturing information is almost free thanks to technology. Hoarding information in untended bookmark managers and overflowing to-do lists has become common, simply because we are exposed to more content than we have time for. In such a world, our attention becomes the currency for social transactions (liking a post, sharing a tweet etc.) on the web.
This is a bias that leads a person to take credit for positive outcomes and blame external factors for negative ones. In other words, anything we do that we perceive as good, or that leads to a good outcome, we tend to positively reinforce with self-serving bias. The hidden problem is whether the action is actually good for us in the long term, and whether it aligns with our big picture and that of society. Assessing each action on its own merits and demerits, prior to the outcome, helps keep the outcome from clouding our judgement.
A concept tied to this bias is the locus of control: the degree to which a person is guided by their own beliefs versus external influence. In everyday life, a person who thinks for themselves is said to have an internal locus of control. Studies show that people develop a more internal locus of control with age.
Perfect information is a concept from game theory where every player in the "game" is fully informed of all prior actions. Chess is an example of a game with perfect information: every move made from the start of the game is visible to both players.
Applied to a market, perfect information means all participants are aware of the current market prices and of their own utility and cost functions at any point in time. Complete information would mean that all strategies, cost functions and moves of all players are common knowledge throughout the game. Both are far from reality, but these concepts have been used to solve different forms of games.
The assumption that buyers are rational is a big one in the case of imperfect information. A rational buyer would not exchange money for something of lower economic value, but with imperfect (and, in most cases, incomplete) information it is hard to assess the value of a purchase, say in the stock market. In a way, the decision to buy or sell depends on the decisions made by others, since those affect the price of the stock.
Perfect information for all market participants would still not be enough to assume everyone behaves rationally. Even if all information is available, it might not be credible; the future can change and render it useless, and buyers operate through a layer of personal biases regardless. This should not be confused with complete information: knowing all possible market scenarios and the moves of all participants, although impossible with current technology, would still not lead to rational decisions. Even with perfect and complete information, we compensate in our own ways.
A business model that has existed in the app economy for a while is advertisements: businesses use ads to monetize their website or app, and users can pay a premium for a better experience without ads. The more people use the app, the more the business gets paid for displaying an ad. Apps have grown to serve large parts of the population and have morphed into ecosystems, for example Facebook together with Instagram, WhatsApp, WhatsApp Business etc. This has increased the value of user data: data about what a user is searching for or talking about on WhatsApp can be used to target ads at them on another platform. That is not new, but the spotlight on how it affects the user's privacy is.
A privacy-as-a-service model could be an additional source of revenue, but it could backfire: less data and more user awareness could mean a poorer ad service and reach for the company, and ads make up the largest part of revenue at these companies. At the end of the day, companies make money when users consume, and tracking your data is a gateway to that.
Compartmental models are a type of mathematical model that simulates events and systems by breaking them into compartments. An example is the material flow in a production plant: a typical plant has sequentially arranged stations along which the product is built, and each station has one or more feeder lines supplying components and tools. Such a model can be used to optimize the flow of material and effort along the production line.
The system first needs to be broken down into compartments, and the relations between these compartments then have to be established. This is the tricky part: a system can be broken down in multiple ways, which has two implications down the line. Different decompositions of the same system expose different parameters of interest, and since the compartments depend on each other, the same parameter can behave differently in the same system just because it was decomposed and modelled differently.
Compartmental models are used in epidemiology, primarily to simplify the mathematical modelling of infectious disease. A simple model labels the population with three compartments, "susceptible", "infectious" and "recovered", and studies the flows between them. A more complex version adds a "carrier" compartment to model the part of the population that shows no symptoms but still carries the disease.
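The three-compartment model above can be sketched as a discrete-time simulation. The infection rate, recovery rate and population below are illustrative assumptions, not fitted values:

```python
def sir_step(s, i, r, beta=0.3, gamma=0.1):
    """One day of a discrete SIR compartmental model.
    beta: infection rate, gamma: recovery rate (illustrative values)."""
    n = s + i + r
    new_infections = beta * s * i / n   # susceptible -> infectious
    new_recoveries = gamma * i          # infectious -> recovered
    return (s - new_infections,
            i + new_infections - new_recoveries,
            r + new_recoveries)

# Population of 10,000 with 10 initial infections, simulated for 160 days
s, i, r = 9990.0, 10.0, 0.0
for _ in range(160):
    s, i, r = sir_step(s, i, r)
print(round(s), round(i), round(r))
```

The compartments only exchange people with each other, so the total population is conserved at every step; that conservation is exactly the kind of structural property compartmental models make easy to check.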
There is usually an industry, or legislation, that drives a lot of money towards research and innovation. Two examples are photography technology and batteries. As smartphones became common and storage became much cheaper, there was a push to reinvent camera imaging technology: high dynamic range imaging (2010), panorama photos (2012), True Tone flash (2013), optical image stabilization (2015), the dual-lens camera (2016), portrait mode (2016), portrait lighting (2017) and night mode (2019) were all improvements made as part of this. The second example is batteries. The primary driver for battery innovation used to be consumer electronics, from devices like the Game Boy and the Walkman to early mobile phones and smartphones. Now the push comes from the automotive industry: the scale is much higher, and so are the technical requirements on the batteries. There is more focus on profitability and sustainability, with emphasis not just on the end product but on the whole supply chain, from ore to recycling plant.
A petrodollar is a U.S. dollar paid to a country for oil. The system came about in the 1970s, during the oil crisis and its high prices, and helped make oil prices more stable. In a way, it pegs the value of oil to that of the U.S. dollar: the spending power of the money received by oil-producing nations is affected by core inflation and other economic factors affecting the USD. Petrodollar recycling is the way these dollars flow back into the US, through purchases of T-bills or contracts with American companies leading to technology transfer. Most of these countries maintain a sovereign wealth fund, funded mainly with petrodollars, for local finances.
We live in a time when a decent internet connection can bring the world's knowledge to our fingertips. Smartphones, together with that, have created a new medium where most human attention is spent. Knowing what users are looking at and what their preferences are is a game-changer. Well, knowing in itself is not the game-changer, but the scale is: we can look at a lot of data before making decisions. Yet even with access to data, we tend to decide based on gut feeling or someone else's opinion (at both a personal and an organizational level), simply because that's how we have always done it. Many decisions could be made in a far more informed way: choosing an employer, finding an apartment, picking a vacation spot, even buying a car.
So even though we are quite far into this new medium, we haven't started to fully utilize its potential. In this new medium, we don't have to physically show up somewhere to get paid: the content we create can do that for us, across the globe.
Every country's health system tries to report the number of COVID-19 cases and related deaths on a near-daily basis. However, the actual system works differently in each country. In some, a person develops symptoms, orders a test, and that takes a few days; once the test is submitted, the results can take more time. So, in a way, the current tally is a reflection of a past state.
This is similar to the light coming from stars light years away from Earth: what we are seeing is a snapshot of the past.
It's good to keep in mind how reporting systems work and what the data actually implies about the real world. If, for example, a company's financial reporting lags far behind its day-to-day business, a discrepancy between price and value can build up quite quickly.
Action bias is the tendency to do something in a situation driven by factors like a need for a sense of control, social norms or peer pressure. We feel compelled to act even when there is no evidence it will lead to a favourable outcome, and acting without processing all the information can lead to less effective action. Society's general bias is against inaction, which gives the impression of not doing what is necessary or not putting in the effort; against that backdrop, premature action can look much better than it actually is. The bias can be seen as a survival instinct: our innate drive to hunt and find shelter has carried over despite the very different environment and lifestyle of today.
An example of this bias is when a person chooses a medical treatment because it feels better than no treatment at all, even if the treatment hasn't been proven to work. In meetings and conversations, we have a tendency to "say something" that doesn't actually add to the conversation but gives a sense of contribution; Charlie Munger calls this the Say Something Syndrome in one of his famous speeches. It is common in investing too: activity from your peers, the general market sentiment in the news, and market activity can all lead to suboptimal action.
The popularity of newsletters has really taken off this year. More and more creators have identified newsletters as a good way to maintain an online presence, and niche creators have turned to them to build an audience. There is a renewed realization that an email list still works as a good marketing lead generator, and that 1,000 true fans are more valuable than a five- or six-digit subscriber count. The infrastructure and tools have become common, democratized and accessible. Newsletters have also become a channel for monetization: the classic ads and affiliate links work, and they are a good channel for marketing your own products and services. Paid communities tied to newsletters are becoming more common, with the newsletter giving a taste of the kind of community they are trying to attract.
Hygge is a Danish lifestyle centered on coziness and well-being; acknowledging the present moment and enjoying it is hygge. The word comes from the Norse word hug, meaning soul, mind and consciousness, and hygge is now considered a core part of Danish culture: a laid-back feeling, appreciating the small joys of life. As opposed to hustle culture, hygge is about taking it slow and eliminating sources of stress; multi-tasking is a big no-no. Lighting candles or a fireplace, indulging in sugary treats and enjoying a cup of coffee are among the most common hygge practices, and hygge can also be an informal gathering with quiet, low-key, mellow conversation.
Fiverr is a web-based platform that lets freelancers offer their services to individuals and businesses, ranging from simple jobs like a logo design to something more complex like SEO marketing. Its business model is known as service-as-a-product: a platform to effectively productize a service. This year Fiverr's stock price has appreciated by 678%. The pandemic has meant a lot more in-house time for creatives, engineers and developers alike, and the number of people trying to start an online business or presence has skyrocketed, increasing demand for one-off services. Fiverr currently has an Alexa rank of 195, up over 100 spots in the last three months, and in terms of revenue the company seems to lead this space by a margin.
Recently, Fiverr launched a subscription-based offering that allows organizations to manage freelance resources. In a way, it doubles as a large pool of consultants who can be used as project resources: more ad hoc, more varied in talent and competence, and saving a lot of time from a recruitment point of view. In my opinion, that's genius.
A system of data-manipulation rules is said to be Turing complete if it can be used to simulate any Turing machine. A Turing machine is a mathematical model of computation: an abstract machine that manipulates data based on a set of rules. The machine Alan Turing described in 1936 was based on a tape of symbols and a head that scans it and is also capable of writing symbols; this became an early model of a central processing unit. In simple words, a Turing complete system is one in which you can write a program to compute anything that is computable: a Turing machine can solve any problem that can be coded. Most programming languages are Turing complete, and even software like Microsoft Excel and PowerPoint is. A blockchain can be used to solve problems by embedding a scripting language that utilizes its distributed nature.
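A toy simulator makes the tape-and-rules definition concrete. This is a sketch, not Turing's original formalism, and the "flip every bit" program is a made-up example:

```python
def run_turing_machine(program, tape, state="start", steps=1000):
    """Minimal Turing machine: program maps (state, symbol) ->
    (new_symbol, move, new_state). '_' is the blank symbol."""
    cells = dict(enumerate(tape))  # tape as position -> symbol
    pos = 0
    for _ in range(steps):
        if state == "halt":
            break
        symbol = cells.get(pos, "_")
        new_symbol, move, state = program[(state, symbol)]
        cells[pos] = new_symbol               # write
        pos += 1 if move == "R" else -1       # move the head
    return "".join(cells[k] for k in sorted(cells)).strip("_")

# A tiny program that flips every bit, then halts at the first blank
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_turing_machine(flip, "10110"))  # prints 01001
```

Anything that can simulate this read/write/move loop with conditional state changes, whether a programming language, a spreadsheet, or a blockchain scripting language, is Turing complete.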
A piece of content that changes everything: a new idea so revolutionary that it changes the worldview of a large group of people on some topic, a revelation or "leak" with second- and third-order effects on the world. Or something simpler: a blog post that actually gets you over 1,000 visits and a shoutout in the NYTimes. It can go bad as well: a piece of content that stirs up a controversy leading to multiple lawsuits.
In the coming years we will see more of this. The nature and scale of the internet will make the phenomenon frequent yet still rare, and harder to spot among all the clickbait out there. In hindsight, however, it will be very clear which piece of content changed the course of our collective thinking.
The best user interface is the one no one notices; it should just melt away into its function, getting out of the way rather than posing as a hurdle to cross. Reducing the number of clicks for each operation is a good start, and making the response time really fast (sub-100ms) also helps, but these are just tricks. What really matters is the overall concept: what needs to be shown and what can be hidden. What is really necessary for the user? What can be defaulted or figured out in some other smart way? Visibility is key: whatever is on screen should be easily understood and discernible. Colors convey a mood, and a combination of colour and movement can convey an emotion.
Extrapolation bias is the tendency to take recent experience and project it into the future. We tend to think in a straight line, and it is very hard for us to imagine or perceive future anomalies. A common place this occurs in finance is when valuing a company: if a company has had 25% earnings growth for the past three years, we are easily drawn to forecasting 25% for the coming years as well. But that is far from certain; a business's ability to create value in the past does not readily determine its ability to do the same in the future. It is like correlating a past coin toss with one that is about to take place, where we know the previous outcome does not influence the next one. The worst part of this bias is how easy it is to make a decision on the underlying assumption that the status quo will hold.
There are 2 kinds of tutorials. The first kind gives you the fish, and the second kind teaches you how to fish. During development, I have caught myself facing the same errors even after looking up a solution on Stack Overflow. Understanding the framework and the problem in question is key; tutorials failing to do that fall into the first category. One of the most efficient ways to learn something is by doing it and seeing it for yourself. The second kind of tutorial is really good at hiding the "actual" doing part within it, making it more of a learning session rather than just following a set of commands on the terminal, only to forget them the next time.
Operating a business incurs many expenses. But what if there were a way to turn those costs into revenue? Amazon realized this quite early on. They have successfully turned most of the expense line-items in their 2005 income statement into revenue-generating products and services. Since Amazon was one of the first to explore and grow in the e-commerce space, it had a considerable advantage in understanding which necessary operational services were missing. Fulfillment of orders turned into Fulfillment by Amazon. Marketing, in a way, has become Amazon Prime. Backend servers and infrastructure have become Amazon Web Services. Payment processing has become Amazon Pay. And now they sell these products and services in almost all verticals and geographies. Productizing internal operations and scaling them will become more common in enterprise software in the coming years.
The human genome is the complete set of nucleic acid sequences encoded as DNA within the 23 chromosome pairs in a cell nucleus. The complete human genome contains about 3 billion base pairs, amounting to roughly 750 megabytes of data. Some of the genes are protein-coding genes, and each of these genes is responsible for several biological processes. All this data is stored in the nucleus of every cell in the human body. Gene editing is a method to edit the DNA of a living organism; editing the DNA can lead to changes in the organism's traits. CRISPR, adapted as a gene-editing tool in the early 2010s, has made it a lot easier to edit genes. Animals share a lot of genes; mice and humans share about 85 percent of their genes. Genes are changed in mice and the effects are studied, which helps better predict the effect on humans. This has applications both in understanding a new disease and in understanding the effect of a medicine. Gene therapy is a way to prevent and treat diseases in humans.
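The 750-megabyte figure follows from simple arithmetic: with 4 possible bases (A, C, G, T), each base pair can be stored in 2 bits.

```python
# Rough storage estimate for the human genome:
# 4 possible bases per position -> 2 bits per base pair.
base_pairs = 3_000_000_000
bits_per_base_pair = 2

total_bytes = base_pairs * bits_per_base_pair / 8
megabytes = total_bytes / 1_000_000
print(f"{megabytes:.0f} MB")  # 750 MB
```

Real genome files are larger in practice because formats like FASTA store one byte per base plus metadata, but 750 MB is the information-theoretic floor for the raw sequence.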
Ergodicity is the idea that a point of a moving system will eventually have travelled through all parts of the space that the system moves in. Think of smoke released in a room eventually filling the whole room. The ergodic theorem has been used to describe common-sense phenomena like the mixing of liquids from a mathematical point of view. An ergodic process is a process that has the same time average and ensemble average. The time average of a process is the average value of one realization of the process over a stretch of time, while the ensemble average is the average of N identical processes at a specific point in time. An ergodic process can be understood by looking at a snapshot of the process rather than the complete picture.
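The time-average-equals-ensemble-average definition can be demonstrated with a fair die, a textbook example of an ergodic process:

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

# A fair die is ergodic: following ONE die through many rolls
# (time average) gives the same answer as rolling MANY dice once
# at the same instant (ensemble average).
N = 100_000

time_avg = sum(random.randint(1, 6) for _ in range(N)) / N      # one die, many rolls
ensemble_avg = sum(random.randint(1, 6) for _ in range(N)) / N  # many dice, one roll each

print(round(time_avg, 2), round(ensemble_avg, 2))  # both near 3.5
```

For a non-ergodic process (say, a gambler who can go bust), the two averages diverge, which is exactly why the snapshot view stops being a safe shortcut.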
A Freudian slip is an error that occurs due to an internal train of thought. It is said to happen due to the interference of the unconscious. It is characterized as a slip because it is an indication of what you are thinking behind the scenes. Sometimes these slips can be quite revealing, both to you and to others: a peek into what you believe, what you wish, and what you haven't addressed yet. At other times, these slips can be a way for your mind to tell you it is sitting on something. Verbal errors are not random; they are puzzles to be solved.
Suppose a text engine like GPT-3 is trained on a lot of text concerning one person, say, all the speeches of a famous personality, and the engine writes a speech posing as them. If some parts of the speech, or even a sentence, didn't make sense, they would easily be discarded as outliers. But what if that is also an indication of a Freudian slip? Can this trait be replicated in systems like these?
This is one of the most common ways we learn and grasp new ideas and concepts. Right from an early age we are taught by examples. Our brains are good at making analogies and generalizing among the examples we see. However, in the real world this can backfire. Looking at history gives only an inkling of what the outcome of a future situation might be. Learning by example is inherently flawed for random systems, as we are only exposed to a limited number of samples. It is common to jump to conclusions immediately based on what we know, but depending on the system or entity you are thinking about, it might be good to revisit the conclusion. When it comes to mathematical equations or the laws of physics, analogy usually works: the underlying systems are deterministic and predictable. But the stock market or a relationship is different, especially anything involving humans. Humans are messy. It is rare to see situations and circumstances repeat themselves, and even rarer to see people reacting the same way as last time. Thinking again might help.
The simplest explanation is usually the right one. This is known as Occam's razor, or the law of parsimony. Occam's razor is used as a heuristic in science to develop theoretical models. In the scientific method, you could reach different conclusions based on the evidence from an experiment, but using Occam's razor there wouldn't be that useful: it is not axiomatic that the simpler explanation must be the right one. Taking that into consideration, Occam's razor could also be stated as "if all else remains equal, a simpler explanation is preferred over a complex one". What is a simple explanation to one person may not be the simplest to someone else. Maybe Occam's razor is another way of saying that the simplest explanation is the most rational one: the option that agrees with all the knowledge and logic that you already have.
Availability bias is the tendency to over-value something just because you were able to recall it well. The underlying assumption is that if we remember it so well, it must be really important. That can be far from reality. If we see 2 recent stories of airplane crashes, we perceive a higher probability of another one occurring. One part of this bias is how well you can recall a similar incident or related memory: the more recent the memory, the better the chances you'll recall it. The second part is frequency: how many such incidents can you recall? That has a direct effect on what we perceive. In both cases the ease of retrieval is what matters. You would more easily remember and retrieve an accident that you were a part of than one that was narrated to you.
This law states that, in a production context, the output of the process will marginally decrease as a single factor is incrementally increased while all other factors remain constant. One example is hiring new employees at a production plant while keeping the machines and floor area constant. Initially the output increases, but after a certain point it begins to stabilize; the limit won't be the number of employees but rather the other factors. In a way, the law also states that in most production and economic processes there is an optimal point, and as soon as it is passed the returns begin to fall. From a business point of view, it is key to understand which parts of the business process are close to that stable point and which factors can be improved.
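The factory example can be sketched with a toy production function. The square-root form and the numbers are illustrative assumptions, not an economic model of any real plant:

```python
# Toy production function with capital (machines, floor area) fixed:
# output grows with headcount, but each extra worker adds less than
# the previous one. The 0.5 exponent is an illustrative assumption.
def output(workers, capital=100):
    return 10 * (workers ** 0.5) * (capital ** 0.5)

# Marginal product of labor: extra output from the (n+1)-th worker.
marginal_product = [output(n + 1) - output(n) for n in range(1, 11)]
print([round(m, 1) for m in marginal_product])  # strictly decreasing
```

Every entry in the list is positive but smaller than the one before it, which is the law in miniature: more workers still help, just less and less.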
We spend a lot of time on laptops and computers, as part of work or for leisure. There are a limited number of ways in which you can interact with them: you can type, use a mouse or a touchpad, talk, or use gestures. Of these, the most common is a combination of keyboard and mouse. That is how we were taught to use a computer. Maybe that is because we were the early adopters, but that is not the case now. Kids are born into a tech-rich environment like never before, and they are much more familiar with these devices at a very early age. Now the question is whether typing and using a mouse is efficient. It turns out that typing alone is the most efficient way to interact with a computer. An above-average typist can type 70 words per minute (WPM); if you really spend some time honing that skill you can even achieve 100 WPM. The average WPM for speaking is 150.
This is nothing new among developers; they figured this out long ago. Vim, a text-based editor built for use from a terminal, uses a combination of modes and commands that allows you to manipulate text at the speed of thought. The interesting thing is that most apps we use are not at all optimized to be keyboard-first. All applications have keyboard shortcuts, but they are not built into the product such that they become second nature to its users. Superhuman has taken a stab at solving this for e-mail, building the complete application and its workflows around keyboard-based triggers. This trend is still at an early stage. The prediction is that there will be more and more attention on keyboard-first applications ("Superhuman for X"). There is a growing market for software and techniques that can improve your efficiency with day-to-day software. Keyboard-first will eventually become the norm.
Not everyone generates the same returns from investing; returns differ based on the decisions investors take. That is why it ends up being a zero-sum game. You make money when you are able to buy at a lower price and sell at a higher price, and at both times you need another market participant on the other side of the transaction. However, there are a few ways you can gain an advantage.
Irrespective of any edge, there are always upsides and downsides to an investment, and balancing them out is an art. It could be either a quantitative or a qualitative measure of whether there is a sufficient margin of safety for a particular investment.
Startups that rely on data models and AI/data-science algorithms for running their business will only get better with time. Moreover, as time passes they gain leverage over newcomers, making it really hard to overthrow a leader in a vertical. In this space I think there can be 2 strategies: the company can choose to niche down or expand to other verticals. If they niche down, over time they will collect niche-specific data and develop models that are effective in that small space. One way such a company can be overthrown is if a new technology emerges that doesn't require as much data to predict consumer behaviour. While that sounds possible, it still seems quite hard. Even if models get more efficient and computers become more powerful and cloud-first, there is a bottleneck in available data. There is only so much real-world data that you can harvest, and out of that, only so much that you can label and sanitize for training purposes.
Product studios are a 21st-century ecosystem that supports creativity and innovation. They could be the next step for the startup world. With apps now available that make it easy to build applications, websites and services using intuitive, graphical tools, the cost of trying something out has gone down. A product studio formalizes this loop of trying something out, from an idea to a product or service delivered to a customer. Studios like this will become key in democratizing and spreading the word that it has become so easy to build something on your own now.
Big companies that still innovate at every level of their organisation have, knowingly or unknowingly, built product studios into it. Some companies call this "culture": building fast, testing fast, learning fast. When these principles become part of the company's DNA, you are not stuck in long meetings trying to convince managers and executives.
Companies have more and more access to data: data from suppliers, from intermediaries, from customers and prospective customers. Traditional companies are not built in a way to make use of all this data. The data has to go through multiple steps and different handlers before business insights can be generated. Typically, the team that handles the data pipeline is buried somewhere under "Engineering" or "Research and Development". However, the users of the data are spread across every vertical, ranging from sales, marketing and logistics to human resources. Don't forget that the company itself generates a lot of data every day from its operations and employees. The crux of the problem with a traditional organisation is that data is harvested and can probably be used in most parts of the organisation. However, not everyone is well-versed in how to play around with data and make sense of it. The whole process of using data to make better decisions isn't a one-off ceremony; rather, it is a feedback loop of varying cadence that needs close attention.
API-based businesses are a game-changer. They let other businesses focus on the things that differentiate them and use "off-the-shelf" solutions for the other parts of the business. APIs have permeated many verticals: payments, e-commerce, messaging, search and automation, just to name a few. APIs form an integral part of the SaaS model; they are the glue that holds different pieces of SaaS software together, and yet they can invariably scale. One reason they scale so well is that the businesses providing an API are focused on doing one thing really well. So well that their solution has all the bells and whistles to support all kinds of businesses across verticals and geographies. The question, then, is whether companies like that can have a moat around them. What is it that gives them an advantage? Since these APIs operate at the software level, in theory it would be quite easy for a company to switch to another API provider down the line; the friction and switching costs would be much lower than changing a hardware supplier. So one of the most important deciding factors for customers to stick with an API provider is integrations and network effects. The more customers use the product, the better it gets: API companies constantly improve their services, leveraging usage data and statistics to build a better product. In effect, the product only gets better, and that is the advantage API companies can build and use as their moat.
Systems behave differently based on their environment. With differing inputs, a system can produce a different set of outputs. Sensitivity analysis is a way to analyze the effect of individual inputs on a system; it helps us understand which inputs have more effect on the system than others. This concept is used in system identification, control theory and even marketing. Ads are usually A/B tested in a smaller group to understand which features of an ad appeal better to users before it is rolled out.
A heuristic way of applying this: all else being equal, a car that travels a longer distance will burn more fuel than a car that travels a shorter distance. This gives us a relation between distance and fuel burnt. In real life there are other factors to consider, like different top speeds, drag, load/mass, etc. But what we are effectively trying to do is abstract all of that away to understand the relation between fuel burnt and distance travelled.
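A one-at-a-time sensitivity sweep over the car example might look like the sketch below. The fuel model and its coefficients are invented purely for illustration:

```python
# One-at-a-time sensitivity analysis on a toy fuel-consumption model.
# The functional form and coefficients are made-up assumptions.
def fuel_burnt(distance_km, speed_kmh, load_kg):
    return 0.05 * distance_km + 0.0005 * speed_kmh ** 2 + 0.001 * load_kg

base = dict(distance_km=100, speed_kmh=80, load_kg=500)

# Perturb each input by +10% while holding the others constant,
# and record how much the output moves.
deltas = {}
for name in base:
    bumped = dict(base)
    bumped[name] *= 1.10
    deltas[name] = fuel_burnt(**bumped) - fuel_burnt(**base)
    print(f"{name}: +10% input -> {deltas[name]:+.3f} L")
```

Ranking the deltas immediately shows which knob the system is most sensitive to; in this toy model the quadratic speed term dominates the linear load term.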
The regret minimization framework is a decision-making framework that tries to reduce future regret. But regret comes from looking back, when we compare what we did with what we should have done. Of course, it wasn't obvious at the time; only in hindsight did it become clear what had to be done instead. One way to mitigate this is to look at people who have gone through this before and learn from their experiences. Learning to listen to your intuition and gut feeling is another trait to master. All the information, first-hand and second-hand experience alike, is broken down and distilled by your unconscious mind. A gut feeling is how it tries to tell you which decision leads to a more favourable outcome.
AI and ML techniques have made big leaps this year, from AlphaFold making a breakthrough on the protein folding problem to the GPT-3 engine that can tackle most text-related problems using its language model. Computer vision techniques have improved as well. All this research has drivers behind it, and usually these drivers are a research question, a grant or a niche. But AI research in the area of understanding real-life context is hard to come by, as there is no immediate beneficiary for such a model. The Google Assistant is close: it looks at multiple sources of data in different forms, like previous searches, calendar appointments and email, to make a better decision and provide a better reply. These kinds of models require something that can account for the great randomness that is humans. The same set of calendar appointments and search inputs could still have a different meaning and level of importance for different people. Rightly judging this based on second-order inputs like usage patterns and keystrokes might be the next big step.
Group buying is when a product is offered at a reduced price if it is bought by a minimum number of buyers. The concept can be traced back to China, and the same idea is now taking over e-commerce: a Chinese e-commerce platform is bringing it to the mobile-first world. The WeChat platform in China is crucial for enabling such a buying experience. Buyers can easily forward good offers to their friends via WeChat, and if they end up buying together, both receive the discount. This takes into account the use-case where you would like immediate feedback from your friends prior to a purchase; Amazon and other e-commerce platforms do not account for this. This well-crafted feature has embedded virality into the e-commerce buying experience.
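The pricing rule at the heart of group buying is simple to sketch. The threshold, discount and prices below are hypothetical, not any real platform's numbers:

```python
# Group-buying price logic sketch: the discount unlocks only once
# enough buyers have joined the same deal. All numbers hypothetical.
def unit_price(list_price, buyers, min_group=3, discount=0.30):
    if buyers >= min_group:
        return round(list_price * (1 - discount), 2)
    return list_price

print(unit_price(100, 1))  # 100 -- buying alone, full price
print(unit_price(100, 3))  # 70.0 -- friends joined, discount unlocked
```

The forwarding step is what makes this viral: each buyer has a direct monetary incentive to recruit the next one.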
Warehouses will become the key to e-commerce growth and dominance. To run an e-commerce company, there are a few key things to nail.
A protein exists as a chain of amino acids linked by peptide bonds. These chains can take random shapes, but what makes a protein biologically functional is the specific shape it folds into after it is synthesized. Ribosomes are found in all living cells, and they perform protein synthesis. The exact shape of a protein is crucial for its function, though parts of a functional protein can still remain unfolded. From a physical point of view, the unfolded state has the highest entropy and is hence the natural state; a certain amount of energy is required for folding to happen and reduce that entropy. Folding can be triggered by hydrophobic interactions. Molecular chaperones are a class of proteins that aid in the correct folding of other proteins. Neurodegenerative diseases have been linked to the misfolding of proteins.
The process of protein folding is not fully understood. There are many variables built into the process that make it really hard to predict the correct functional fold for a protein; it is estimated that a natural protein can have on the order of 10^300 possible conformations. It is not truly random, but a model that can fully replicate the process is still out of reach. Understanding this fundamental biological process can help in developing new medicines and in understanding new diseases.
Wrapped is Spotify's year-end marketing campaign, and this year's campaign was unveiled yesterday. Spotify collects and distills a year of your listening history to present your top artists, genres and more for the year. As per Spotify's Q1 report, they have about 286 million monthly active users (MAU). That is a lot of data to process behind the scenes. This year's Wrapped has some new features.
Wrapped involves both distilling data from each user and presenting it in a meaningful way. A write-up from last year gives a bit of insight into what happens in the background. Last year's Wrapped had an even larger scope, covering the whole decade instead of just one year. Spotify uses a data lake backed by Bigtable, which is optimized to aggregate data over an arbitrary period of time. Even though the amount of data was much larger, Spotify was able to reuse previously executed jobs: each user had a row in Bigtable, with each column holding the result from one year. Decoupling the data processes improves overall efficiency. Breaking user summaries into smaller data stories and workflows allowed for a more flexible system. Insights like songs you might have missed would need a recommendation system that uses these data stories as inputs.
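The row-per-user, column-per-year idea can be sketched with a plain dictionary. The schema, field names and figures below are illustrative assumptions, not Spotify's actual data model:

```python
# Sketch of decoupled yearly aggregation: one row per user, one
# column per year. A decade summary then reuses the already-computed
# yearly results instead of re-reading raw play logs.
user_rows = {
    "user_123": {  # hypothetical user id
        "2017": {"top_artist": "A", "minutes": 9_000},
        "2018": {"top_artist": "B", "minutes": 12_000},
        "2019": {"top_artist": "A", "minutes": 15_000},
    },
}

def decade_minutes(user_id):
    # Sums the per-year columns; no raw event data is touched.
    return sum(year["minutes"] for year in user_rows[user_id].values())

print(decade_minutes("user_123"))  # 36000
```

Because each yearly column is computed once and then treated as immutable, widening the window from one year to a decade costs only a cheap sum over precomputed columns.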
Alexa Rank has become synonymous with THE ranking of websites on the internet. The ranking is a direct indicator of a website's popularity: it estimates the average daily unique visitors to a website. The estimate is made from data tracked through one of its browser extensions. Alexa Internet is the company behind the ranking, and it is wholly owned by Amazon. Google also has its own ranking system, based on Google Analytics. The rankings of the same site can vary quite a bit between these 2 systems. One reason could be that not everyone uses an Alexa browser extension, which means the sample population is much smaller, whereas Google draws on much broader data sources: its search engine and browser.
Websites are becoming home to people's work and lives. As more and more personal websites and blogs surface, maybe Alexa could become the ultimate indicator of social status on the internet. But the scale of the internet dilutes this to a certain extent, so the rank does not carry much significance for website owners. Instead, the position of a website on a search results page has become more of an indicator of popularity for personal and niche websites.
It is estimated that cloud computing will account for 13% of world electricity consumption by 2030. A prediction: in the future, computing power won't be the bottleneck or the parameter to optimize; energy consumption will be. Maybe that will even become the benchmark for developing state-of-the-art AI/ML algorithms. Chips are going down to 5nm and server-grade hardware is pushing its limits as well; the performance bottleneck will soon be insignificant compared to the energy tax each iteration of an algorithm takes.
Energy will become the major operating cost for data centers and cloud servers; it will be the one parameter that can make companies like AWS run leaner on a daily basis. There are 2 possibilities then. Either there has to be a breakthrough in how we fundamentally store, retrieve and erase data on a physical medium, or in how we carry out computations; the latter would prove more useful, as it is the energy-heavy component of the two. Logic-in-memory is a hybrid approach that combines both aspects and can save energy. The second possibility is that we figure out innovative ways to counter the energy problem; it has been claimed, for instance, that underwater data centers are a viable option.
They both are similar in some aspects.
But they differ as well. In the newspaper industry there are 3 main players: the editorial, the advertisers and the readers. In equity research there are research analyst firms and institutional investing firms. The business models are quite different.
This is an index issued daily by the Baltic Exchange in London. It directly measures the demand for shipping capacity versus the supply of dry bulk carriers. The supply of cargo ships is quite tight, as it takes a lot of time to commission a ship, and it is costly to leave a ship idle. So a marginal increase in demand can push the index up quite a bit.
The BDI is a practical economic indicator on a global scale. It is also a measure of what it costs to ship raw materials like iron ore and coal. The index is calculated by calling ship brokers around the world to ask what the prices are for 22 different routes, with ships divided based on their capacity.
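The mechanics of such an index can be sketched as a weighted average of broker quotes. The vessel classes match dry-bulk convention, but the rates, weights and route counts below are invented for illustration; the real BDI methodology differs in its details:

```python
# Toy freight index: average the broker-reported charter rates per
# vessel class, then combine the classes with fixed weights.
# All rates (USD/day) and weights are hypothetical.
reported_rates = {
    "capesize":  [21_000, 22_500, 21_800],
    "panamax":   [14_000, 13_800],
    "supramax":  [11_000, 11_500],
    "handysize": [8_000, 8_200],
}
weights = {"capesize": 0.40, "panamax": 0.30, "supramax": 0.20, "handysize": 0.10}

index = sum(
    weights[cls] * (sum(quotes) / len(quotes))
    for cls, quotes in reported_rates.items()
)
print(round(index, 2))
```

Because the inputs are actual transaction quotes rather than tradable instruments, there is no single price anyone can push around, which is the source of the index's manipulation resistance discussed below.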
The BDI is a very visible indicator of demand for materials and commodities globally. It is a simple index and is difficult to manipulate: since it is calculated directly from prices around the world and driven by supply and demand, it is shielded from the involvement of governments, speculators and other players. The BDI usually goes up when economies around the world are growing: demand and therefore commodity prices rise, and companies and stock prices around the world continue to increase in value.
Spatial computing can be considered an extension of IoT for objects in space. It is a concept that includes technologies like virtual reality, augmented reality and mixed reality. AR headsets use "spatial computing" to interact with the real objects around them. A futuristic scenario is when the objects around you are not only connected to a common medium like the internet but also orchestrate themselves to achieve a higher-level goal. Or when you can interact with objects that are thousands of miles away, or fix machines on the other side of the planet.
For a technology like this to manifest, there must exist a high-speed, low-latency network that allows this kind of high-bandwidth communication across multiple objects, along with the subsequent processing of that data. Another prerequisite could be a common language that these disparate objects and their underlying technologies can use to talk to each other. Lastly, the use-cases need to be simplified to make development and innovation profitable and viable. A use-case that is quite common now is an array of sensors spread through the aisles of a supermarket to better understand the behaviour of customers.
According to the theory of monogenesis, all pidgins and creole languages of the world can be traced back to one language. A pidgin is a language that has lexical features from 2 languages, along with simplified grammar and a smaller vocabulary. Pidgins are most commonly found in African countries with numerous ethnic groups, each with its own language; an English-based pidgin is used among certain communities to communicate with each other. Pidgin languages are fluid and change over time. A creole is a stable language that is formed from several languages; English-based creole languages are common in the Caribbean region. European language spread patterns have been connected to early colonial ties with different parts of the world. Language spread and diffusion occur as large groups of people move around and settle in other parts of the world. In a way, languages and the mixing that occurred over the years provide a better picture of how people moved around.
A bubble is an economic event where the market price of an asset rises rapidly; usually, this sudden rise is followed by a crash. Like most market phenomena, this too is driven by market behaviour, and a bubble can occur at any time. During a bubble, there is a large transfer of money to the parts of the market experiencing rapid growth. Once the bubble has burst, the wealth is transferred back.
There can be many reasons why the market behaves this way. The dot-com bubble of 2000 was fueled by the collective illusion that internet-based businesses would have very high leverage and low running costs. However, that was not the case: making an internet business viable was hard. The speculation led to a huge increase in the prices of such companies.
"Fear index" is the popular name of the Chicago Board Options Exchange's CBOE Volatility Index (VIX). It is a 30-day forward-looking volatility index, used as a measure of the volatility the market expects. Volatility is the level of price fluctuation observable in past data; a simple way to gauge it is the standard deviation and variance of past returns, which tell us how large the spread was. The fear index, however, is a measure of future volatility, also known as implied volatility. This is calculated by looking at option prices: the price of a call option on a stock that expires in a month is an indication of the probability and the market's general expectation for the price of that stock in a month's time. The VIX is constructed using the implied volatilities of S&P 500 index options.
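The backward-looking measure mentioned above, realized volatility from past prices, can be computed in a few lines. The price series is made up; the sqrt-of-252 annualization is the standard trading-days convention:

```python
import statistics

# Historical (realized) volatility from past prices: the
# backward-looking counterpart to the VIX's implied volatility.
# Prices are hypothetical daily closes.
prices = [100, 102, 101, 105, 103, 104, 108, 107, 110, 109]

# Simple daily returns between consecutive closes.
returns = [(b - a) / a for a, b in zip(prices, prices[1:])]

# Sample standard deviation of daily returns, annualized with the
# usual sqrt(252 trading days) convention.
daily_vol = statistics.stdev(returns)
annualized = daily_vol * 252 ** 0.5
print(f"annualized volatility ~ {annualized:.1%}")
```

The VIX answers the forward-looking version of the same question, but it backs the number out of option prices instead of measuring past returns.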
The VIX hit its highest intraday level since 2008 in March this year, a 280% surge, primarily due to the Covid-19 outbreak. Since then, however, it has settled down to almost pre-Covid levels. The VIX provides a direct way to cash in on the general fear and volatility of the market.
Anchoring bias causes you to get distracted by the information that was presented first, even if it is not relevant to the task at hand. Information that follows is judged relative to what came first: the first piece of information has become anchored in your thought process. This happens when you look at stock prices. When you buy a stock at a price and it eventually gives you a higher return, anchoring kicks in: even though the fundamentals of the business haven't changed, you don't want to add to your existing position, because you are anchored to the initial price. When looking at deals, we are trained to compare prices to judge whether we're getting a bargain, and by default, the first price is what the comparison is done against. This happens a lot in sales as well. The salesman almost always starts the pitch at a much higher price, anchoring the buyer to it, so any discount the salesman then offers looks like a relatively good deal.
From the periodic table, it is easy to dismiss the elements that are gaseous or liquid at normal pressure and temperature; they wouldn't make a good currency anyway. Then come the alkali metals, which are too reactive to be considered for everyday use and exchange, and the same applies to radioactive elements. Then come elements like titanium and zirconium: very durable and stable, but extracting them from their ores required specialized equipment that didn't exist back in the day (circa 1000 BC). Aluminium was hard to extract as well, and iron rusts. Then we are left with a few elements known as the noble metals, "noble" because they don't easily react with other elements, which means they occur in nature in their pure form. They are platinum, rhodium, palladium, iridium, osmium, ruthenium, silver and gold. A problem with the noble ones, except for silver and gold, is that they are so rare it would be very hard to forge coins out of them. Silver tarnishes over time, but gold relatively does not. This inertness is one of the main reasons why man has used gold over thousands of years as a form of currency.
Back in the day, currencies were tied to gold, meaning that each paper note of a currency was backed by an equivalent amount of physical gold. But during the Great Depression, countries severed these ties so that they could print more money and reflate their economies. More recently, the US removed the gold backing of the dollar in the early 1970s. With a steady supply of gold and varying demand, the price can swing quite a bit. The bottom line is that gold inherently does not have any value; society at different times in history assigned it value. This could be the reason why many still hold gold in their portfolios.
Temperature inversion is a meteorological phenomenon where an atmospheric property deviates from its normal change with altitude. It usually refers to the inversion of the thermal lapse rate, the rate at which temperature falls with altitude. Temperature drops as you gain altitude due to the lowering of pressure, following the ideal gas law. An inversion means the temperature is higher at a higher altitude than at a lower one. This traps air pollution like smog closer to the ground. An inversion can occur when a warm, less dense air mass moves over a denser, cold air mass. This can cut off convection in the region, stopping the air from circulating. As a consequence of inversion, radio waves can be refracted, making it possible to receive radio signals from long distances.
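The normal lapse-rate profile is easy to compute; the standard-atmosphere figure of roughly 6.5 °C of cooling per kilometre is the usual textbook value, and the surface temperature below is an arbitrary example:

```python
# Normal (non-inverted) temperature profile using the standard
# atmospheric lapse rate of ~6.5 degC per km of altitude gained.
LAPSE_RATE = 6.5  # degC per km, standard-atmosphere approximation

def temp_at_altitude(surface_temp_c, altitude_km):
    return surface_temp_c - LAPSE_RATE * altitude_km

print(temp_at_altitude(15, 0))  # 15 at the surface
print(temp_at_altitude(15, 2))  # 2.0 -- colder aloft, the normal case

# During an inversion this profile flips within a layer: air at, say,
# 1 km is WARMER than the surface air, capping convection below it.
```

An inversion is simply any layer where the measured profile violates this monotonic decrease, which is why it is detectable from a single weather-balloon sounding.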
Of course, this is generalized and over-simplified, but it gives a framework for thinking about early-stage businesses.
Metacognition is "thinking about thinking". It is being aware of what and how we think and why we think in a particular way. This is a skill that can compound very well if used properly. Metacognition is critical to learning. We need to engage different ways of learning for the different types of knowledge we are trying to acquire. Reflecting on the very steps of learning and the acts involved is part of metacognition. Spending time on reflection after tasks that particularly tested the limits of your ability is a good way to practice metacognition.
One key aspect of metacognition is self-regulation. This can be split into 3 parts: Planning, Monitoring and Evaluation. Planning involves trying to understand the problem and choosing the right set of strategies and tasks. Monitoring involves using your own awareness to judge, during the process, how effective you are. Evaluation refers to appraising the final goal compared to the outcome of the task and figuring out how it can be improved the next time around. This fits well into the idea of mental models. The mental models are the set of tools you would have to use to solve a problem. Each time you perform self-regulation, you are in a way sharpening some of those tools bit by bit.
This is a bias where there is a tendency to reach a conclusion based on the first stimuli we get. This tendency may also be attributed to energy-saving behaviour. Once we think about our first conclusion, we have given it more ammunition. This can lead to 2 things. Firstly, we will stop accepting newer ideas and viewpoints into our thinking. Secondly, we try to reason out why the first conclusion is the right one. A mental routine to counter this bias is to acknowledge it and to think inversely: assume that the first conclusion HAS to be wrong. This will allow us to ask more objective questions rather than settling in with the first conclusion.
An example: we see a shelf in the supermarket that is empty. Our first conclusion might be that the particular product is currently trending and is being "over-bought". But there could be other explanations as well. Maybe the product was recalled due to some defect. Or the shelves were emptied because the product was not selling at all and was closing in on its expiration date.
This is a bias that occurs mainly in media. With shrinking attention spans, media institutions are forced to summarize and condense news to make it more appealing to their readers. But summarization comes at a cost. The content loses context, and that can have a big effect on how the news is perceived by the reader. Summarization also means that something has to be omitted. Different viewpoints can portray a news story in different ways, and the choice of omission can depend on the institution and the biases built into it. At the other end of the spectrum are books. They offer a lot more context and background information on a particular topic or issue.
This bias has a political dimension as well. Politicians and campaign managers can focus on certain issues and leave out nuances. This then trickles down to other media like Twitter or sound bites. Concision bias is at the center of the whole "right-leaning" and "left-leaning" media discussion. They lean by leaving out some parts of the story.
A sovereign wealth fund (SWF) is a state-owned fund that invests in real and financial assets. Countries that accumulate funds by operating the government, the banking system and from the exploitation of natural resources can use that money to buy assets through a sovereign fund. They are created when countries have an economic surplus and low international debt. Countries use this fund to maintain their social welfare system, for example the world's biggest SWF, Norway's Government Pension Fund Global. For countries that generate most of their income from exporting non-renewable resources, an SWF is a way to park the revenues. It shields them from the cycles of the market and also paves the way for a more stable income in the future. If designed properly, they could act as a buffer for the country's economy. Resource revenue can be highly volatile, dependent on the finiteness of the resource and the unpredictability of extraction.
An SWF can be used to buy key and strategic assets in foreign industries, thereby gaining control. This is concerning in a global context, as it becomes an indirect way to exert control over other countries. An accountability framework is key for an SWF to function. Large amounts of money in the wrong hands is not something anybody wants, especially where governments are unstable.
Governments have expenses through a fiscal year to maintain their operations. They include defense, social security, healthcare, and public services like post, museums, libraries, public transport, ports etc. Governments generate revenue from activities like personal and corporate taxes. When the expenses are higher than the revenues, the budget is in a deficit. US debt is considered one of the safest forms of debt worldwide and is held by many economies globally. Countries across the world can own a piece of this debt by buying treasury bills. Every year the government is obligated to pay interest on the outstanding debt, while it remains affordable for the government to continue expanding the country's economy. Government debt is usually considered a safe instrument since the central bank can always print more money to pay the debt. But in the process it also devalues the currency, leading to more inflation.
LiFi stands for Light Fidelity and is a communication technology that uses light to transmit data between two devices wirelessly. Transmission can occur in the infrared, visible or ultraviolet spectrums. However, since light cannot penetrate walls and can get attenuated by the medium, LiFi is limited to short distances. Light reflected off walls can be used to extend the range. Light is not affected by electromagnetic fields, which makes it suitable for use in conjunction with other systems like radar.
Every light fixture in a building can contain a modem and chip to facilitate the communication. Since the light flickers at a very high frequency, it will not be visible. The fundamental hardware requirements to set up a LiFi network are quite small, making it a viable candidate to scale. Every light-based IoT device could possibly be used as a node in a LiFi network, although they are limited to confined spaces without much interference from the sun. If that can be solved in some way, there is huge potential for disruption. Traffic lights and vehicles in the vicinity would be able to talk to each other. Aircraft and ATC could use it as a communication channel. Cameras could be connected to all the lighting equipment on a movie set. Then the important question to ask is whether the hardware will become so cheap that it would be a better option than a Bluetooth or any other RF-based hardware and software stack.
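To make the idea of data-over-light concrete, here is a minimal sketch of on-off keying, the simplest scheme a light-based link could use: bytes become LED on/off pulses and back. The function names are hypothetical, and real LiFi modems use far more sophisticated modulation and error correction.

```python
def encode_ook(data: bytes):
    """Encode bytes as on/off light pulses (1 = LED on, 0 = off), MSB first."""
    return [(byte >> bit) & 1 for byte in data for bit in range(7, -1, -1)]

def decode_ook(pulses):
    """Reassemble sampled light levels back into bytes."""
    out = bytearray()
    for i in range(0, len(pulses), 8):
        byte = 0
        for bit in pulses[i:i + 8]:
            byte = (byte << 1) | bit
        out.append(byte)
    return bytes(out)

pulses = encode_ook(b"hi")
print(decode_ook(pulses))  # b'hi'
```

The "invisible flicker" in the text corresponds to these pulses being emitted at megahertz rates, far above what the eye can perceive.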
Simple intelligence is cheap. It can be replicated with minimal hardware and a few lines of code. It can be as simple as a self-adjusting thermostat. Embedding intelligence into everyday objects is not a new concept. The Internet of Things (IoT) connects these local devices to each other. It introduces a medium to use and a language to communicate with. Taking this one step further, the devices not only talk to each other but also to a central server that monitors them and tries to identify usage patterns and behavioural quirks of the user. This makes it more of a collective intelligence. Thanks to "recent" developments in web and networking technologies, and the excessive number of abstraction layers, it seems seamless on the surface. However, under the hood there are still many interfaces, handovers, and conversions taking place.
Technological development of this scale spans many years and is spread across the world, split between large companies, startups, government regulations and various standard-setting organizations. But if we were to start from scratch and build it again, would we choose the same interfaces and handover mechanisms? Or would we go for something much simpler? Considering not just the top-level abstractions and APIs, but the complete SW stack right down to the metal.
This is the bias of concentrating on past events that made it past a milestone or a selection criterion and not considering the rest. One easy example is the startup stories we see in biographies of Silicon Valley founders, podcasts, etc. There is a huge bias even in terms of the type of content available, which is skewed towards success stories. One way to think about this is that search engines are trying to serve up what we ask for. Or maybe that when we try to learn something new, our natural tendency is to look at successful cases and try to replicate them or learn what they did, rather than looking at failed examples and learning what NOT to do. This goes against the grain of how anything is taught in school and in college.
Radioactive atoms lose energy by emitting radiation from their nuclei. This is known as radioactive decay and is a random process at the atomic level. It is impossible to predict when a single atom will decay, but for a significant number of identical atoms it is possible to predict the average amount of decay that would occur. This overall decay rate is commonly expressed as the half-life. In other words, the half-life is the time taken for something to halve in quantity. An element can have atoms with different numbers of neutrons. These are known as isotopes, and each isotope has its own half-life. Carbon-14 has a half-life of 5730 years. It is commonly used to date organic material.
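The decay law behind the half-life can be written in a few lines. The carbon-dating example below assumes the 5730-year half-life quoted above:

```python
import math

def remaining_fraction(elapsed_years, half_life_years):
    """Fraction of the original quantity left after `elapsed_years`:
    N(t)/N0 = (1/2) ** (t / T_half)."""
    return 0.5 ** (elapsed_years / half_life_years)

def age_from_fraction(fraction, half_life_years):
    """Invert the decay law to estimate age from the measured fraction,
    as in carbon dating: t = T_half * log2(N0 / N)."""
    return half_life_years * math.log2(1 / fraction)

C14_HALF_LIFE = 5730  # years

print(remaining_fraction(5730, C14_HALF_LIFE))        # 0.5 (one half-life)
print(round(age_from_fraction(0.25, C14_HALF_LIFE)))  # 11460 (two half-lives)
```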
This concept so far pertains to physics, but it can be extended to ideas as well. Ideas and knowledge have a half-life too. What we learn today might not be true in some years. When a field is in its growing stage, the half-life is quite short, indicating that new theories are being found that disprove existing ones. In the past it used to take longer to disprove something and change the commonly prevailing notion. Now, with a low cost to spreading and sharing ideas, the half-life has shrunk even more. Secrets also have a half-life. It is becoming harder and harder to keep secrets with exposure to a global medium like the internet.
Product-market fit is a stage during the initial days of a startup when the product and users align. One common characterization is the point when users start talking about your product to others on their own. Put simply: product-market fit is when you build something that people want.
Startups take different amounts of time to achieve this depending on the problem they are trying to solve and the industry they are in. Before a startup reaches product-market fit, it will iterate through multiple solutions based on feedback from different stakeholders. The shorter this feedback loop, the faster the iteration. The tricky part of achieving product-market fit is that it is a 2-dimensional problem: each time you tweak the product, you in turn also shift the market for that new solution.
Carbon pricing puts a fee on activities that lead to greenhouse gas emissions. This is a way to incentivize businesses to make the right decisions for the environment, as it will directly affect their bottom line. The price itself reflects the effects of emissions, i.e. costs to the public, rising temperatures and sea levels, lower air quality etc. It can be used to generate revenue in the economy, which can fund other green initiatives, thus creating a virtuous cycle. In a way, everything businesses do that is harmful to the environment is currently subsidized: the associated costs don't affect their bottom line. Carbon pricing is a neat way to bring that back into the economic machine. It can also influence where investors put their money. A carbon price could make it more expensive to run conventional companies, making green-tech companies more attractive.
Game theory is a theoretical framework for evaluating and analysing social situations among competing players. It can be applied to real-world scenarios and used to predict their outcomes. The central piece of the theory is the game, which acts as the model. Within it exist players with different value vectors that depend on the actions of the other players. Those actions can be classified and boiled down to strategies.
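A minimal sketch of these ideas: a game as a payoff matrix, players choosing strategies, and a check for a Nash equilibrium, where no player gains by deviating alone. The payoffs below are the classic prisoner's dilemma values, used purely as an assumed example.

```python
# payoffs[(a, b)] = (payoff to player 1, payoff to player 2)
payoffs = {
    ("cooperate", "cooperate"): (3, 3),
    ("cooperate", "defect"):    (0, 5),
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),
}
strategies = ["cooperate", "defect"]

def is_nash(a, b):
    """(a, b) is a Nash equilibrium if neither player can improve
    their own payoff by unilaterally switching strategy."""
    best_a = all(payoffs[(a, b)][0] >= payoffs[(alt, b)][0] for alt in strategies)
    best_b = all(payoffs[(a, b)][1] >= payoffs[(a, alt)][1] for alt in strategies)
    return best_a and best_b

equilibria = [(a, b) for a in strategies for b in strategies if is_nash(a, b)]
print(equilibria)  # [('defect', 'defect')]
```

The interesting part is that mutual defection is the equilibrium even though mutual cooperation pays both players more, which is exactly the kind of outcome the framework is built to expose.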
Within finance, a phenomenon noted early on with IPOs was that some stocks had a significantly positive run. Game theory provided an explanation: there existed two groups, one that knew the true value of the stock and another that was still uninformed. The group that knew the true value would only buy at or below the true value, while the uninformed group would be left with overpriced blocks. Game-theoretic models have been used to explain different aspects of finance. However, with more information and data becoming available, behavioural models are being used to explain what game theory couldn't. This can be due to the fact that game theory assumes a certain degree of rationality from the players, which might not hold in the real world.
Monte Carlo simulations are used to determine the probability of an outcome from a model by using random variables. When the model contains a variable that is uncertain, a Monte Carlo simulation takes that variable and assigns it a random value. Based on repeated runs of the simulation, the end result is then averaged to provide an estimate. In many cases, this has proven to be more accurate than "gut feeling" and other soft methods. Since the model can produce different outputs for the same input due to the random variables, it is a stochastic method. This method works well when the model contains many coupled variables. The repeated simulation of the model can uncover patterns across varying inputs for the random variables.
Of course, a model can only predict and account for whatever is built into it. If there are inefficiencies and non-linearities that were approximated away to simplify the model, that will be reflected in the outcome as well. Monte Carlo simulations have applications in a wide range of fields, including statistical physics, oil exploration etc.
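A small sketch of the method described above: a toy project model with uncertain task durations, run many times and averaged to estimate the probability of missing a deadline. All distributions and numbers are illustrative assumptions.

```python
import random

def simulate_project_once():
    """One run of a toy project model with uncertain task durations (days).
    The distributions are made-up assumptions for illustration."""
    design = random.uniform(3, 7)
    build = random.gauss(10, 2)
    test = random.uniform(2, 5)
    return design + build + test

def monte_carlo(runs=100_000, deadline=18):
    """Repeat the model many times and average: the fraction of runs
    exceeding the deadline estimates the probability of an overrun."""
    overruns = sum(simulate_project_once() > deadline for _ in range(runs))
    return overruns / runs

random.seed(42)
print(f"Estimated probability of missing the deadline: {monte_carlo():.2f}")
```

Note how no single run is meaningful on its own; only the distribution over many runs carries the answer, which is the stochastic nature the text refers to.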
Batteries have always been a limitation for electrifying devices and vehicles.
"Today, batteries account for a substantial portion of the size and weight of most electronics. A smartphone is mostly a lithium-ion cell with some processors stuffed around it. Drones are limited in size by the batteries they can carry. And about a third of the weight of an electric vehicle is its battery pack. One way to address this issue is by building conventional batteries into the structure of the car itself, as Tesla plans to do. Rather than using the floor of the car to support the battery pack, the battery pack becomes the floor." -
One way to increase overall efficiency is to reduce the weight of the battery storage. This can be done either by embedding the battery within the structure or by making the structure itself the battery. In a structural battery, the cells have to be molded into the shape of an aircraft body or a smartphone case. But structural batteries carry a huge safety risk: a crash or dent could potentially set off an unstoppable chemical reaction. Aviation is a hard industry to electrify, simply because jet fuel is about 40 times more energy dense than typical lithium batteries. This would mean that electric airplanes would end up being really heavy. Embedding batteries into different parts of the structure isn't as efficient as making the structure from the battery itself. New combinations of cell chemistry are being researched, where the electrolyte is a semi-rigid polymer resembling cartilage. These cells could potentially be embedded into moving parts like robots, just like fat. Fat is an efficient energy store: it is distributed across the body and serves other functions, like insulation, as well.
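The "40 times" figure above can be sanity-checked with rough specific-energy numbers. The values below are order-of-magnitude assumptions, and the comparison ignores that electric drivetrains are far more efficient than jet engines, which narrows the gap in practice.

```python
# Rough specific energies (illustrative, order-of-magnitude figures).
JET_FUEL_WH_PER_KG = 12_000   # ~43 MJ/kg
LI_ION_WH_PER_KG = 300        # good current lithium-ion cells

ratio = JET_FUEL_WH_PER_KG / LI_ION_WH_PER_KG
print(f"Jet fuel is ~{ratio:.0f}x more energy dense than lithium-ion")

# Battery mass needed to carry the same energy as 10 tonnes of fuel,
# ignoring efficiency differences.
fuel_kg = 10_000
battery_kg = fuel_kg * JET_FUEL_WH_PER_KG / LI_ION_WH_PER_KG
print(f"Equivalent battery mass: {battery_kg / 1000:.0f} tonnes")
```

Even with the drivetrain-efficiency correction, the mass penalty is why "really heavy airplanes" is the immediate consequence the text points at.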
The Cambrian explosion was an evolutionary event around 541 m.y.a. (read as: million years ago) when most of the animal phyla started appearing in the fossil record. A phylum is a level of classification in biology one level below kingdom. Almost all present animal phyla appeared during this period. The event lasted for about 20 to 25 million years. A leading theory for the cause of this event is the increase in oxygen levels in the atmosphere. The early Earth's atmosphere did not contain free oxygen; the oxygen that is currently part of the atmosphere is the result of billions of years of photosynthesis. Metabolic functions may have been slowed down by the lack of oxygen prior to the Cambrian explosion, for example the creation of collagen tissue, which is required for hard exoskeletons. This also explains why there are very few fossils prior to this period.
In our times, there are some events that can be considered an explosion of development. Maybe the technological Cambrian explosion was the invention of electricity or the transistor: one fundamental technological change that then had profound effects on all aspects of life. For pre-technology times, writing is probably the Cambrian explosion equivalent. It became the primary way in which we stored and transferred knowledge across generations, enabling advancement based on past experiences and mistakes.
A blockchain is a list of records connected by cryptographic algorithms. A new record is added to this list along with its timestamp, data, and the hash of the previous block. The list is resistant to modification, as a record cannot be changed without changing the hashes of all the subsequent records. With the limitations of today's computing power, that is almost impossible. Bitcoin, a cryptocurrency, makes use of this technology and was one of the first digital currencies to solve the double-spending problem. Here the network itself could be used as the central authority of trust, so there was an assurance that all transactions were resolved. Another use-case for this technology is smart contracts. A blockchain network can be used as the trusted entity for a contract between 2 parties and can also facilitate the transfer of money between them. Supply chain is an interesting use-case where a blockchain can provide valuable information about a product right from the raw material stage to the final product. This can have implications for trade, illegitimate institutions and the enforcement of regulations within food supply chains. Health care records can also be maintained using this technology, making universal and accessible health records available to health institutions with the right authentication, together with the patient.
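A minimal sketch of the hash-chaining idea described above: each record embeds the previous record's hash, so tampering with any block breaks verification downstream. This shows only the chaining itself, with no consensus, proof-of-work or networking.

```python
import hashlib
import json
import time

def make_block(data, prev_hash):
    """A record links to its predecessor by embedding the previous hash."""
    block = {"timestamp": time.time(), "data": data, "prev_hash": prev_hash}
    payload = {k: block[k] for k in ("timestamp", "data", "prev_hash")}
    block["hash"] = hashlib.sha256(
        json.dumps(payload, sort_keys=True).encode()).hexdigest()
    return block

def verify(chain):
    """Recompute every hash; tampering anywhere breaks the links."""
    for i, block in enumerate(chain):
        payload = {k: block[k] for k in ("timestamp", "data", "prev_hash")}
        expected = hashlib.sha256(
            json.dumps(payload, sort_keys=True).encode()).hexdigest()
        if block["hash"] != expected:
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = [make_block("genesis", "0" * 64)]
chain.append(make_block("alice pays bob 5", chain[-1]["hash"]))
print(verify(chain))          # True
chain[0]["data"] = "tampered"
print(verify(chain))          # False: the first block's hash no longer matches
```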
As more countries adopt CBDCs, there can exist interoperable real-time payment and settling systems that span across borders. This can be a good thing or a bad thing. International markets and prices will become more responsive and volatile, while faster and more up-to-date settling of payments across the globe will be encouraged. In a way this will help provide a better snapshot of economies around the world. CBDCs can also change the risk-free rate: if the central bank directly offers a safe place to park money for any legal citizen, that will by definition become the risk-free rate of the country. Bank rates will then be a marked-up rate for the additional services that banks can provide. From a Nordic model point of view, this fits perfectly into the distributed, state-controlled welfare system. It can also make things like universal basic income and other government benefits more accessible.
It's a very simple concept: gross margin is the net amount of money that a company keeps from its revenue. Gross margins have a lot of impact on a Software-as-a-Service (SaaS) business. SaaS companies have 2 forms of leverage. Firstly, the ability to scale at a fraction of the cost of actually expanding across geographies. Secondly, relatively lower operating costs, which mean that they can keep most of their revenues. Gross margins are hard to improve. They usually rise in the initial months and stabilize thereafter. As a SaaS business scales up, it adds more inertia to the company. It becomes more complex to run a multi-feature, large-scale app, and this eats into the gross margin. Gross margins affect the valuation, as they are a direct indicator of the business' ability to create value. The value of a business can be expressed as the discounted sum of its current value/assets in place and the systems/products in place that can create more value in the future. The two parts of this equation should be worked out separately to avoid double counting.
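The basic calculation, with hypothetical numbers to illustrate the contrast described above between a SaaS business and, say, a hardware business with heavy cost of goods:

```python
def gross_margin(revenue, cost_of_revenue):
    """Gross margin = (revenue - cost of revenue) / revenue."""
    return (revenue - cost_of_revenue) / revenue

# Hypothetical figures: the SaaS business keeps most of its revenue...
saas = gross_margin(revenue=10_000_000, cost_of_revenue=2_500_000)
# ...while the hardware business gives much of it back as cost of goods.
hardware = gross_margin(revenue=10_000_000, cost_of_revenue=7_000_000)
print(f"SaaS: {saas:.0%}, hardware: {hardware:.0%}")  # SaaS: 75%, hardware: 30%
```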
A data architecture lays out the framework, models and rules under which data will be stored and managed. This can be the physical architecture, like how the data will be stored in different physical servers and backends. It can also be the logical architecture: how the data has to be stored in different databases and what attributes it should have. The choice of data architecture depends on the type of data and the application it's being used for. One tricky part of determining the right architecture is finding a logical abstraction for different kinds of data. A common abstraction example is the layer-based architecture for SW running on dedicated HW, like Android OS on a smartphone. The data architecture also depends on how the people working on the data are organised. The teams that need to use the data and maintain it for various business operations usually have different ways of working, tools, and use-cases. By virtue of how they are organised, there would also exist an abstract data pipeline and a sequence of operations and processes. This then puts constraints on the data architecture. Different parts of the data require different exposure to the outside world. Some of it is probably used for internal purposes, may relate to users, and must be hidden from public view, while the data that is required to render a product page is more outward-facing.
The Diderot effect states that buying something new creates a spiral of consumption, leading you to buy new things. Things that you never needed in the first place. Buying a brand new car leads to buying car accessories that you did not need. This can be explained by breaking it down into two parts. The first is the initial need that triggers the purchase of something new. The second is the social and mental value we put on the new possession. For example: what good would a new couch be unless I upgrade the surrounding furniture? One reason for this could be the constant consumer-favouring ads, influencers, and content that take up our attention these days. A consequence of this kind of environment is that we almost never think about "downgrading" or "simplifying" our lifestyle. Now it's always about "more". Could this be because consumption is at the center of the economic machine and we still haven't figured out a way to grow sustainably?
Peter Thiel, in his book Zero to One, listed 7 questions a startup must answer in order to make it big.
Opportunity cost is the difference in cost between a choice and its next best option: the loss of potential gain by choosing an alternative. Even though the concept arises from a financial point of view, it can be applied to almost any kind of decision-making. Every decision is a trade-off, and the opportunity cost is an indication of how much we are actually trading off. The benchmark rate of return used to compare different investment options is another application of opportunity cost. When we decide to spend time on a task, we are inherently deciding not to spend that time on something else. That will have a consequence, or a "cost", attached to it. When you choose to buy a house, you are saying no to renting one, moving to another place etc. All these options have an opportunity cost attached to them.
What is a true indicator of a country's economy? The most prominent stock indices are often used as a proxy for the current status of the market and, by extension, the economy. However, the stock market only represents the bigger and stronger companies in the economy, those that were capable enough to get publicly listed. It does not account for the small businesses, the out-of-the-garage shops and the unorganised sector. They form a considerable part of the economy and are probably the first to feel the repercussions of an impending downturn. They are the first to lose access to capital and debt. Is GDP per capita a good indicator of the economy? Well, it averages out the difference between the 2 sides: the one represented in the stock market, and the rest.
Measuring national income can account for all parts of the economy, but it is a lagging indicator: it measures the state after the fact. The yield curve on risk-free instruments, in contrast, is a leading indicator, as it measures the overall market sentiment over a future time horizon. Some indicators capture the economic activity that occurs between economies. Could a combination of these be used to offer a more complete picture of the economy? A starting point could be to classify all the different economic activity that happens within an economy and how each class can be measured fairly accurately in a given time frame.
The US Air Force is hiring companies to build new networking technology as part of its project known as the Advanced Battle Management System (ABMS). “The goal of ABMS is to enable the Air Force and Space Force to operate together and as part of a joint team – connecting sensors, decision makers and weapons through a secure data network enabling rapid decision making and all-domain command and control,” according to an Air Force press release.
This project is part of Joint All-Domain Command & Control (JADC2), a software platform that will help orchestrate battles on multiple fronts - air, land, sea, cyber - and even the electromagnetic spectrum. JADC2 aims to link entities like ships, aircraft, soldiers etc. This can help run operations much more efficiently. More importantly, it opens up the possibility of collecting tons of data from the field and using it to better predict scenarios and outcomes, which in turn can be used to strategically deploy resources during an operation in real time. Could this mean that we are building a real-world war machine that can predict attacks and weaknesses of the opponent with more certainty? On the other hand, this can become an Achilles' heel as well.
Lateral thinking is a concept developed by Edward de Bono. It involves a way of thinking that does not adhere to the typical step-by-step thinking that follows a logical order. Our brains are conditioned to think in a straightforward manner. It takes deliberate effort to do some kind of lateral thinking to arrive at a conclusion or solve a problem. Traditional thinking is more vertical, where each thought is connected or related to the next one following some logic. Edward de Bono argues that thinking happens in 2 stages. The first is a perceiving stage, where the brain sets the environment, the constraints to think under. In the second stage, the brain uses that environment to arrive at a conclusion.
He has also described 4 ways to practice deliberate lateral thinking:
Federated learning is a machine learning technique that trains algorithms on separate local samples without exchanging them. This enables training an algorithm using multiple devices, which is a huge plus from a data privacy and data security point of view. The basic principle is that the algorithm is trained on the locally available data and the resulting model parameters are then exchanged with other instances. The "other instances" can be either centralized or decentralized. Determining data characteristics from just the parameters is close to impossible. Splitting the datasets into smaller local sets counteracts biases that may only be seen in some data sets. Smartphones use this form of learning: a central model is retrieved from the cloud, the local data produced by the smartphone (for example, usage statistics, keyboard strokes etc.) is used to update the model, and the updated model is then sent back to the cloud over secure channels. This shields the raw user data from the external cloud infrastructure.
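A toy sketch of the principle: each device takes training steps on its private data, and only the model parameter leaves the device, which a server then averages. This is a bare-bones federated averaging loop on an assumed 1-D linear model; real systems add client weighting, sampling and secure aggregation.

```python
import random

def local_update(weights, data, lr=0.1):
    """Train locally (one SGD pass of a 1-D linear model y = w*x).
    Only the updated weight leaves the device, never the raw data."""
    w = weights
    for x, y in data:
        grad = 2 * (w * x - y) * x  # gradient of the squared error
        w -= lr * grad
    return w

def federated_round(global_w, device_datasets):
    """One round: each device trains on its private data,
    then the server averages the returned parameters."""
    local_ws = [local_update(global_w, data) for data in device_datasets]
    return sum(local_ws) / len(local_ws)

# Toy setup: every device's private data follows y = 2x plus noise.
random.seed(0)
devices = [[(x, 2 * x + random.gauss(0, 0.1)) for x in (0.5, 1.0, 1.5)]
           for _ in range(5)]
w = 0.0
for _ in range(20):
    w = federated_round(w, devices)
print(f"learned weight: {w:.2f}")  # converges close to 2
```

The server only ever sees the scalar `w` from each device, which is the privacy property the text describes.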
A major field where this can be used is the healthcare vertical, starting from all the data that is harvested from consumer wearables to data from hospitals and insurers. It fits the criteria of having very dispersed data and can still comply with legislation like GDPR.
Compounding is a concept that primarily comes from finance and economics. Basically, compounding refers to the interest on interest that you gain on a deposit or a loan. Albert Einstein is often credited with calling compound interest mankind's greatest invention. Compounding goes beyond the field of economics; it can be applied to relationships, habits and learning as well. If we try to boil it down, it comes down to two things. The first is the frequency component, i.e. the sheer discipline of performing a task, or sitting down to write, consistently, day after day. Lionel Messi once said: "I start early, and I stay late, day after day, year after year. It took me 17 years and 114 days to become an overnight success". The second component is the passage of time. Sometimes the change is not visible and it can be demotivating. But time is the only way to manifest compounding.
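The mechanics can be shown with the standard compound interest formula; the 7% rate and the amounts below are arbitrary examples:

```python
def compound(principal, annual_rate, years, compounds_per_year=1):
    """Future value with compound interest:
    FV = P * (1 + r/n) ** (n * t)."""
    r, n = annual_rate, compounds_per_year
    return principal * (1 + r / n) ** (n * years)

# 1000 at 7% per year: modest over a decade, dramatic over three.
print(round(compound(1000, 0.07, 10)))  # 1967
print(round(compound(1000, 0.07, 30)))  # 7612
# More frequent compounding helps, but far less than more time does.
print(round(compound(1000, 0.07, 10, compounds_per_year=12)))  # 2010
```

Note the asymmetry: tripling the time horizon nearly quadruples the gain, while compounding monthly instead of yearly adds only a few percent, which is the "passage of time" point above.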
It is an effect where you experience positive or negative effects from an inactive "look-alike" substance or treatment. The inactive substance is called the placebo. The mind has a powerful influence on the body, and expectations can trigger this effect: the greater the expectations, the more they can feel it. The effects are less prominent in people who don't believe that the medicine or treatment is going to help them. The placebo effect is extensively used in medical research. Test subjects are given either the actual medicine under trial or the placebo. This helps get a more accurate idea of the efficacy of the medicine. To take this one step further, these studies can be done in a double-blind way. In a double-blind study, neither the researchers nor the test subjects know which treatment they are getting. This helps reduce the bias in the study due to the placebo effect.
TransferWise is an online money transfer service. TransferWise offers transfers at the mid-market rate, i.e. the rate that you would find on Google. Other banks and transfer services usually offer a higher rate and make a profit out of it. TransferWise uses smart matching to match your transaction to another one that is happening the other way around. This avoids currency conversion and enables them to make a profit even at the mid-market rate. In addition, they have a network of local banks around the world to facilitate these transactions locally.
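A toy sketch of the matching idea: opposite-direction transfers are netted against each other at the mid-market rate, so only the unmatched remainder needs an actual cross-border conversion. The function, the rate and the numbers are hypothetical simplifications, not TransferWise's actual algorithm.

```python
def match_transfers(gbp_to_eur, eur_to_gbp, mid_rate=1.15):
    """Pair opposite-direction transfers so money stays in-country.

    gbp_to_eur: amounts (in GBP) users want converted to EUR.
    eur_to_gbp: amounts (in EUR) users want converted to GBP.
    Returns (matched GBP settled locally, unmatched GBP remainder).
    A real system nets flows continuously and covers the remainder
    from local bank-account pools.
    """
    gbp_demand = sum(gbp_to_eur)             # GBP needing EUR
    gbp_supply = sum(eur_to_gbp) / mid_rate  # opposite EUR flow, in GBP terms
    matched_gbp = min(gbp_demand, gbp_supply)
    unmatched_gbp = gbp_demand - matched_gbp
    return matched_gbp, unmatched_gbp

matched, unmatched = match_transfers([100, 250], [230, 115])
print(round(matched, 2), round(unmatched, 2))  # 300.0 of 350 GBP matched
```

Because both sides settle at the same mid-market rate, neither group of users pays a conversion spread on the matched portion.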
SPAC stands for Special Purpose Acquisition Company, also known as a blank check company. A SPAC is a shell company used to raise money that can then be used to take a private company public. This enables a company to bypass the traditional IPO process. SPACs can be used to fund mergers and acquisitions as well. This gives retail investors an opportunity to take part in private-equity-type transactions. All the money raised in the IPO is held in a trust account. From the time of its IPO, a SPAC has a time limit before which it should acquire an operating business and perform a reverse merger, causing the private company to become public. Companies selling to a SPAC can get a higher sale price than in a private equity deal, making it attractive for them. The founders benefit from the common stock portion in the new company, and the investors receive an equity interest. The downside for an investor is capped by the fact that they have a redemption option if they do not like the acquisition target.
In 2019, there were 59 SPAC IPOs that collectively raised $13.6 billion. Year to date, a total of 55 SPACs have raised $22.5 billion. Some famous SPAC IPOs include , , . Michael Bloomberg is reportedly in talks with a hedge fund manager to take his media empire public.
A Central Bank Digital Currency (CBDC) is the digital form of fiat money. It is issued and regulated by the government and hence carries legal tender status. Most proof-of-concepts are based on blockchain technology and inspired by cryptocurrency-like systems. But unlike a cryptocurrency, a CBDC would be centrally controlled, so the distributed ledger used in Bitcoin is not required. In a way, a CBDC can be seen as the digital version of "cash" as we know it. A CBDC would not need intermediaries and clearing houses to complete transactions, so money transfers and payments can happen in real time. It could help curb illicit money activity (like tax evasion). CBDCs could also give everyone an easy way to park money safely, which could push banks to compete with better deposit products.
This opens up another way in which the central bank can directly affect the economy. Currently, reserve banks use and to control the circulation of currency. But a direct method of injecting money into circulation, also known as , can increase liquidity. CBDCs would also counter the influence of more distributed cryptocurrencies like and corporate digital currencies like .
This is a psychological phenomenon where your performance is influenced, positively or negatively, by others' expectations of you. A classic example is the classroom. A teacher tends to pay more attention to and encourage students they like and expect to perform well, which creates a positive feedback loop. From the student's point of view, they feel a need to prove the teacher right and don't want to let them down. This pushes them to work even harder, resulting in a better performance, as expected. In a way it ends up being a self-fulfilling prophecy. We can use this effect to positively influence the people around us: hold high standards for them, and that alone will have an effect. Holding higher standards for yourself can work the same way.
A pulse oximeter measures the oxygen level in your blood in a non-invasive way. Haemoglobin is the protein in red blood cells that carries oxygen, and it occurs in two states: oxygenated or deoxygenated.
The pulse oximeter passes red light and infrared light through the finger and measures what comes out on the other side. Oxygenated haemoglobin absorbs more infrared light and lets more red light through, while deoxygenated haemoglobin does the opposite. So, from the ratio of the sensor readings for red and infrared light, the proportions of oxygenated and deoxygenated haemoglobin can be calculated. Limitations of this approach are that the sensor has to be calibrated and that light from ambient sources needs to be accounted for.
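The ratio calculation above can be sketched in a few lines. The linear mapping below (SpO₂ ≈ 110 − 25·R) is a commonly cited textbook approximation of the empirical calibration curve; real devices use per-device calibration, so treat the numbers as illustrative.

```python
def spo2_estimate(red_ac, red_dc, ir_ac, ir_dc):
    """Estimate blood oxygen saturation from the 'ratio of ratios' R.

    The pulsatile (AC) and steady (DC) components of the red and infrared
    signals give R = (red_ac/red_dc) / (ir_ac/ir_dc). The linear formula
    SpO2 ~ 110 - 25*R is a common textbook approximation of the empirical
    calibration curve -- an assumption here, not a medical-grade formula.
    """
    r = (red_ac / red_dc) / (ir_ac / ir_dc)
    return max(0.0, min(100.0, 110.0 - 25.0 * r))

# A ratio R of about 0.5 corresponds to healthy saturation:
print(spo2_estimate(1.0, 100.0, 2.0, 100.0))  # 97.5
```

Using the AC/DC ratio for each wavelength cancels out differences in overall light intensity, which is part of why ambient light still has to be shielded or subtracted separately.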
Knowing when to stop reading a book and switch to another one is a skill. Not finishing a book is not a sign of failure; rather, it is a sign that you value your time even more. Reading non-fiction books serially, i.e. one book after another, is also quite inefficient. Reading a few books in parallel and jumping between them helps keep the interest level high. However, there might be advantages to sticking with a single book in the case of fiction, especially when the story draws you into a whole other world and you develop a connection with the plot and the characters.
Dollar cost averaging is an investment strategy where the investor splits up the amount to be invested and invests it over a period of time. The immediate effect is that the effective purchase price gets averaged out over that period. This shields the investor from putting the entire amount in just before a sudden drop in price, which makes it easier for people to enter the market. It also reduces the regret of buying at a high price: since the purchase is split over time, if the price drops you still buy part of your position at the lower price.
In a way this strategy stretches out the decision. There is more time to change your mind with this kind of approach, which can be a good or a bad thing depending on your investing style. The time horizon over which the investments are spread also matters. For example, during the Covid-19-related volatility, a time horizon of 6 months would not have been wise; with a horizon of 24 months, the volatility would have been averaged out in the end.
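The averaging effect is easy to see with a small calculation. The prices and monthly amount below are made up for illustration; the point is that a fixed periodic amount buys more shares when prices are low, so the effective cost per share is the harmonic mean of the prices, which is never above their simple average.

```python
def dca_cost_per_share(prices, amount_per_period):
    """Invest a fixed amount at each price; return the effective cost per share.

    A fixed amount buys more shares at low prices and fewer at high prices,
    so the result is the harmonic mean of the prices -- always less than or
    equal to their simple average.
    """
    shares_bought = sum(amount_per_period / p for p in prices)
    total_invested = amount_per_period * len(prices)
    return total_invested / shares_bought

# Hypothetical monthly prices over four months, investing 300 each month:
prices = [100, 80, 120, 90]
print(round(dca_cost_per_share(prices, 300), 2))  # 95.36
print(sum(prices) / len(prices))                  # 97.5 (simple average)
```

Note that the effective cost (95.36) beats the simple average price (97.5) without any market timing; that gap is the whole mechanism behind dollar cost averaging.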