1. The Past, Present, and Jaw-Dropping Potential Future of Virtual Reality

    Every life experience, from our birth to our death, can be reduced to electrical stimulation of our brains by sensory organs providing us with information about the world around us. “Reality” is our interpretation of these electrical signals, which means that our brains essentially construct our own reality. Whatever you feel, hear, see, taste, or smell is an interpretation of the world around you that exists solely in your own brain. Even if we understand this concept, we generally work under the assumption that our interpretations are pretty close to the external world. In fact, this is not true at all. In certain crucial ways, the brain “sees” things that do not actually reflect the information being presented to our senses. We each live in our own reality bubble, constructed from both how we perceive through our senses and how our brains interpret those perceptions. Color exemplifies this: color is not a property of the world around us; rather, it is a category created by our perception. To experience the world meaningfully, the brain must filter it through our own lenses. This is what makes virtual reality so intriguing for the future of communication in a variety of fields.

    Currently, our main method of communicating our perceptions is with words, and words have often proven ineffective for relaying our intentions and interpretations. With virtual reality, there is the potential for us to literally show each other the way we see. Virtual reality allows us to reveal a world without our filter, which could endow mankind with a new method of communication that is a sort of telepathy, bridging the gap that exists because of our unique interpretations of the world. With virtual reality, there is far less ambiguity about what we mean than there is when we speak our intentions. The result is a much closer-to-perfect understanding, as all parties hold the same information. Understandably, excitement about these possibilities spans a variety of fields. In this blog, we will look into the history of virtual reality, how it works, and its various applications.

    History

    Though the concept of virtual reality has been around since the 1950s, most people were not aware of it until the 1990s.1 However, the roots of this revolutionary idea stretch back well before the technology itself existed. If you think of virtual reality as starting from the idea of creating the illusion of being somewhere other than where we actually are, it can be traced back to the panoramic paintings of the early 19th century.2 These murals were designed to fill the viewer’s entire field of vision to make the paintings come to life, creating the illusion of really being there. Clearly, the desire to see things differently than our reality has been present for centuries.

    In 1838, Charles Wheatstone conducted scientific research integral to the development of virtual reality. His work showed that each eye sees a slightly different two-dimensional image, and that the brain combines the two into a single three-dimensional image. Using this principle, he invented the stereoscope, which gave the illusion of immersion in an image. The stereoscope later inspired the invention of the View-Master, which was designed for “virtual tourism.”

    In the 1930s, Stanley G. Weinbaum anticipated virtual reality in his science fiction short story “Pygmalion’s Spectacles.”3 The story centers on a virtual reality system that uses goggles to play back holographic recordings of experiences involving all of the senses. In 1956, the first real step toward virtual reality came into existence with the invention of the Sensorama.4 The Sensorama was invented by cinematographer Morton Heilig, who produced short films for the machine that immersed the viewer in the experience using a 3D display, vibrating seats, and smell generators. In the 1960s, Heilig followed the Sensorama with the invention of the Telesphere Mask, the first head-mounted display, which featured stereoscopic 3D imagery and stereo sound.

    In 1961, engineers at Philco Corporation created the Headsight, a precursor to the head-mounted displays we know today.5 It used a separate video screen for each eye and a magnetic motion-tracking system linked to a closed-circuit camera, and it was designed to let military users view dangerous situations from a distance. As the user moved their head, the camera moved with it so they could look around the environment naturally. It was not yet integrated with a computer, however. That came in 1968, when Ivan Sutherland and his student Bob Sproull created the Sword of Damocles, the first virtual reality head-mounted display connected to a computer.6 The device was so heavy that no user could comfortably support its weight; it hung from the ceiling, and users had to be strapped into it. In 1969, computer artist Myron Krueger developed a series of responsive “artificial reality” experiences.7 His projects GLOWFLOW, METAPLAY, and PSYCHIC SPACE ultimately led to VIDEOPLACE, which allowed people to communicate through this responsive virtual reality.

    In the 1980s, despite all the technology that had been developed in the field, there still wasn’t a term for virtual reality. In 1987, the term “virtual reality” was coined by Jaron Lanier, founder of the Visual Programming Lab (VPL).8 Through VPL research, Lanier developed a series of virtual reality devices, including VR goggles and gloves. These represented a giant leap forward for haptic technology, meaning touch interaction.9

    In 1991, virtual reality became publicly available through a series of arcade games, though it was still not available in homes. In these games, a player would wear VR goggles that provided immersive, stereoscopic 3D images, and some units even allowed for multi-player gaming. In 1992, the sci-fi movie “The Lawnmower Man” introduced the concept of virtual reality to the general public, with Pierce Brosnan playing a scientist who uses virtual reality to turn a man with an intellectual disability into a genius.10 Interest in virtual reality surged, and in 1993, Sega announced a VR headset for the Sega Genesis console, though the technology failed to develop and it was never actually released. In 1995, Nintendo released its own attempt at a 3D gaming console, the Virtual Boy, which flopped because it was so difficult to use and was discontinued shortly after release. In 1999, the concept of virtual reality went mainstream with the film “The Matrix,” in which some characters live entirely in virtually created worlds; though previous films had touched on the concept, it was “The Matrix” that had a major impact.

    In the 21st century, virtual reality technology has seen rapid development. As computer technology has evolved, prices have gone down, making virtual reality more accessible. With the rise of smartphones have come the HD displays and graphics capabilities necessary for lightweight, usable virtual reality devices, and technologies such as camera sensors, motion controllers, and facial recognition are now part of daily life. Companies like Samsung and Google offer virtual reality through smartphones, and Sony offers a VR headset for its PlayStation games. The rising prevalence of virtual reality headsets has made this technology widely known. Given the strides VR technology has made in the last decade, the future of virtual reality offers fascinating possibilities.

    How it Works

    For the sake of simplicity, we will explain how virtual reality works through head-mounted displays, as these are the most widely known virtual reality technology. In most headsets, video is sent from a computer to the headset over an HDMI cable.11 Headsets use either two video feeds sent to a single display or one LCD display per eye. Lenses are placed between the screen and the eyes, and they can sometimes be adjusted to match the distance between the user’s eyes. These lenses focus and reshape the picture for each eye, creating a stereoscopic 3D image using the principle Wheatstone demonstrated nearly two centuries ago.
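
    To make the stereoscopic idea concrete, here is a minimal, illustrative sketch (in Python, not any particular headset’s SDK) of how a renderer might derive two slightly offset camera positions, one per eye, from a single head position. The interpupillary distance used here is an assumed average, not a figure from any vendor.

```python
# Minimal sketch (not any specific headset's SDK): rendering a stereoscopic
# pair by offsetting a virtual camera left and right by half the
# interpupillary distance (IPD). Values are illustrative.

IPD_METERS = 0.064  # a commonly assumed average adult IPD (~64 mm)

def eye_camera_positions(head_position, right_vector, ipd=IPD_METERS):
    """Return (left_eye, right_eye) camera positions for one rendered frame."""
    half = ipd / 2.0
    left_eye = [p - half * r for p, r in zip(head_position, right_vector)]
    right_eye = [p + half * r for p, r in zip(head_position, right_vector)]
    return left_eye, right_eye

# Example: head at eye height above the origin, "right" pointing along +X.
left, right = eye_camera_positions([0.0, 1.7, 0.0], [1.0, 0.0, 0.0])
print(left, right)  # two slightly different viewpoints -> two 2D images
```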

    VR head-mounted displays also immerse the user by widening the field of view, meaning the width of the image.12 A full 360-degree display is unnecessary and too expensive, so most headsets use a field of view of around 100 to 110 degrees. For the picture to be convincing, the frame rate must be at least 60 frames per second, though most advanced headsets go beyond this, upwards of 100 frames per second.
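
    To put those frame-rate figures in perspective, a quick back-of-the-envelope calculation shows the time budget a renderer has for each frame at different frame rates:

```python
# Back-of-the-envelope check of the frame-rate figures above: the time budget
# available to render each frame is 1000 ms divided by the frame rate.

for fps in (60, 90, 100):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 60 fps -> 16.7 ms, 90 fps -> 11.1 ms, 100 fps -> 10.0 ms
```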

    Another crucial aspect of VR technology is head tracking.13 Head tracking means that the picture in front of you shifts as you move your head. Head-tracking systems aim for 6DoF (six degrees of freedom), plotting your head’s position and rotation along the X, Y, and Z axes to capture all head movements. Depending on the specific headset, the sensors used may include a gyroscope, a magnetometer, and an accelerometer.
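
    As a rough illustration of how those sensors can work together, the sketch below shows a generic complementary filter that fuses gyroscope and accelerometer readings to estimate one rotation axis. This is a simplified, hypothetical example, not the tracking algorithm of any specific headset, and the sensor values are made up.

```python
import math

# Minimal sketch of fusing a gyroscope and accelerometer to track one
# head-rotation axis (pitch). Generic complementary filter; illustrative only.

def update_pitch(prev_pitch_deg, gyro_rate_deg_s, accel_y, accel_z, dt, alpha=0.98):
    """Blend integrated gyro rate (smooth, but drifts) with the
    accelerometer's gravity-based tilt estimate (noisy, but drift-free)."""
    gyro_pitch = prev_pitch_deg + gyro_rate_deg_s * dt
    accel_pitch = math.degrees(math.atan2(accel_y, accel_z))
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch

pitch = 0.0
# e.g. 100 sensor samples at 1 kHz while the head tilts upward
for _ in range(100):
    pitch = update_pitch(pitch, gyro_rate_deg_s=30.0, accel_y=0.5, accel_z=9.8, dt=0.001)
print(round(pitch, 2))  # estimated head pitch in degrees after 0.1 s
```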

    Headphones are also used in VR headsets to increase immersion. In general, either binaural or 3D audio is used to give the user a sense of the depth and direction of sound, so that a sound can seem to come from the side, from behind, or from a distance.
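
    One simplified way to see how direction can be conveyed through audio is the interaural time difference: a sound off to one side reaches the nearer ear slightly earlier than the farther one. The sketch below computes that delay under an assumed head width; real spatial-audio engines use far richer models, so treat this purely as an illustration.

```python
import math

# Simplified illustration of one cue used in binaural/3D audio: the interaural
# time difference (ITD), i.e. how much earlier a sound reaches one ear than
# the other. Head width is an assumption; real engines use richer models.

SPEED_OF_SOUND = 343.0   # m/s in air
HEAD_WIDTH = 0.18        # assumed ear-to-ear distance in meters

def interaural_time_difference(azimuth_deg):
    """Approximate ITD in milliseconds for a source at the given angle
    (0 = straight ahead, 90 = directly to one side)."""
    return HEAD_WIDTH * math.sin(math.radians(azimuth_deg)) / SPEED_OF_SOUND * 1000

for angle in (0, 30, 90):
    print(f"{angle:>2} degrees -> {interaural_time_difference(angle):.2f} ms")
```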

    Motion tracking is still being perfected in VR headsets. Some systems use motion sensors to track body movements; the Oculus Touch, for example, provides wireless controllers that allow you to use your hands to perform actions in a game.

    Finally, eye tracking is the latest component to be added to certain VR headsets. An infrared sensor monitors the user’s eye movements so that the program knows where you are looking within the virtual environment. This allows in-game characters to react to where your eyes are and makes depth of field more realistic. Further development of this technology is also expected to reduce motion sickness, as the experience will feel more natural to your brain.

    With a greater understanding of this revolutionary technology, you can see how it can be useful in countless ways across a variety of fields.

    Military/Defense Applications

    The military was one of the earliest drivers of virtual reality, and the technology has already provided it a great deal of value, with more possibilities on the horizon. Currently, virtual reality is being used to train soldiers for war.14 It is not hard to understand why the military leapt on this technology, as it allows a user to experience a dangerous environment without any actual danger. This makes military training not only safer but more cost-effective in the long run, as real or physically simulated exercises are expensive and can damage costly equipment.15 Combat simulators are a common military application of VR, using headsets to give soldiers the illusion of being at war.16 This not only prepares them for the experience of war, it gives them a space in which to practice using military technology, with the ability to try again if they make a mistake. It also allows them to train with each other within a virtual world, enhancing a unit’s communication.17 These virtual reality headsets also help soldiers prepare to make important decisions in stressful situations.18 Given the demographics of army recruits in training (young adult men), this method of training is highly effective, as this group has grown up playing video games and finds the learning method appealing.19 Virtual reality is not only useful for training soldiers; it may also help them heal after combat, particularly in treating PTSD.20 The idea is that virtual reality may allow soldiers to be exposed to potential triggers in a safe environment, helping them process their symptoms and cope with new situations.

    In the future, the military will likely take advantage of further developments in VR technology by enhancing the realism of its simulators. More humanitarian and peacekeeping training will likely be done through VR, and facial recognition technology may be incorporated to assess a person’s emotional state, which could further enhance communication both between soldiers and when interacting with people in foreign countries. Regardless of how this new technology is applied, it is a safe bet that the military will remain at the cutting edge of the latest VR technology.

    Commercial Applications

    Presently, the entertainment industry is next in line after the military to benefit from further development of virtual reality technology. Most obviously, the world of gaming has seen impressive (and not so impressive) advancements with VR headsets. Just a couple of years ago, virtual reality gaming seemed unlikely to actually come to fruition. Today, the three most prominent VR game systems are the Oculus Rift, PlayStation VR, and the HTC Vive.21 Each features games that allow users to immerse themselves in an environment, whether it is a boxing ring, a truck, or Gotham. The future of VR in gaming will likely center on the development of better eye tracking and motion detection within virtual reality. With these developments, video games will be more immersive than ever.

    Today, mobile phone companies are competing to create the most compelling VR device. Google recently released the Daydream View, a VR headset that is designed to be more comfortable and technologically advanced than its predecessor, Google Cardboard.22 Samsung has also recently released a comparable device called the Gear VR.23 Both of these devices allow the user to virtually visit anywhere in the world, use a series of apps, and also, as can be expected, play immersive games. As virtual reality technology becomes more prevalent, affordable, and usable, it is certain that more of these devices will saturate the market.

    Psychological Applications

    Finally, virtual reality has shown promise in the field of psychology. As mentioned above, VR has shown potential for the treatment of PTSD. Beyond that, there is evidence to suggest that virtual reality could be applied to the clinical treatment of other anxiety disorders, such as phobias.24 Research is also underway into how virtual reality could help people with schizophrenia cope with delusions and paranoia by allowing them to face their fears in a controlled setting.25 Finally, virtual reality has the power to change how psychological research itself is performed. With VR, researchers could gain far deeper insight into how people perceive and experience the world, giving them a greater understanding of how to treat certain conditions.26

    The future of virtual reality is beyond anyone’s wildest imagination at the moment, but it is safe to assume that the technology will only become more realistic from here. The potential applications are enormous in the military, the private sector, and the world of psychology, and other areas are set to benefit in ways we cannot yet anticipate. With time, virtual reality may be commonly available in everyone’s living room. Regardless of its specific future applications, virtual reality is set to change the world.

    Further Reading

    If you want to learn more about the fascinating technology behind VR or its applications, see the links below for further reading.

    The Future of Virtual Reality – TheNanoAge.com
    Virtual Reality in the Military: Present and Future – René ter Haar
    Everything You Need to Know Before Buying a VR Headset – Wired
    A Virtual Out-of-Body Experience Could Reduce Your Fear of Death – Seeker
    The Use of Virtual Reality in Psychology – Computational and Mathematical Methods in Science


    1. http://electronics.howstuffworks.com/gadgets/other-gadgets/virtual-reality8.htm
    2. https://www.vrs.org.uk/virtual-reality/history.html
    3. http://www.gutenberg.org/ebooks/22893
    4. https://www.wareable.com/wearable-tech/origins-of-virtual-reality-2535
    5. http://www.redorbit.com/reference/the-history-of-virtual-reality/
    6. https://www.freeflyvr.com/time-travel-through-virtual-reality/
    7. http://thedigitalage.pbworks.com/w/page/22039083/Myron%20Krueger
    8. http://www.jaronlanier.com/general.html
    9. https://www.vrs.org.uk/virtual-reality-gear/haptic/
    10. http://www.imdb.com/title/tt0104692/
    11. https://www.wareable.com/vr/how-does-vr-work-explained
    12. http://electronics.howstuffworks.com/gadgets/other-gadgets/virtual-reality.htm
    13. http://ftp.hitl.washington.edu/projects/knowledge_base/virtual-worlds/EVE/II.G.Military.html
    14. http://science.howstuffworks.com/virtual-military.htm
    15. https://www.geospatialworld.net/article/virtual-reality-trains-soldiers-for-the-real-war/
    16. http://fortune.com/2015/12/16/army-training-with-vr/
    17. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.76.3048&rep=rep1&type=pdf
    18. https://www.wareable.com/vr/how-vr-is-training-the-perfect-soldier-1757
    19. http://abcnews.go.com/Technology/treating-ptsd-virtual-reality-therapy-heal-trauma/story?id=38742665
    20. http://www.techradar.com/news/gaming/15-best-vr-games-best-virtual-reality-games-for-pc-and-mobile-1300576
    21. http://www.trustedreviews.com/google-daydream-view-review
    22. http://www.trustedreviews.com/news/samsung-takes-the-fight-to-daydream-vr-with-new-gear-vr-controller
    23. http://ieeexplore.ieee.org/document/1106906/?reload=true
    24. https://www.psychologytoday.com/blog/know-your-mind/201605/how-virtual-reality-could-transform-mental-health-treatment
    25. https://www.hindawi.com/journals/cmmm/2015/151702/
  2. The Fragility of China’s Economy

    In the current global landscape, China teeters on the edge of financial collapse.1 Between foreign military pressures and domestic economic insecurity, it remains in a precarious position, at risk of being crushed between these two opposing forces. The foreign military pressure can currently be traced to China’s expansion into the South China Sea.

    South China Sea

    Much of the foreign military pressure on China comes from its recent actions in the South China Sea. In recent years, China has laid claim to some 90 percent of the South China Sea and created artificial islands there using dredging equipment.2 This is much to the distaste of its neighbors, who have competing claims. In 2016, it was discovered that China was using these artificial islands for military purposes. While this is hardly a surprise, it does raise the question of exactly how useful these territories will be for China’s military strategy.

    China’s effort to militarize the South China Sea began in earnest in 2009, when it submitted a map to the UN featuring the now-infamous Nine-Dash Line, a series of boundary dashes across the South China Sea declaring it Chinese territory. Since that time, China has expanded at least seven different reefs and islets using sand dredged from the ocean floor, including Mischief Reef, Hughes Reef, Subi Reef, Johnson Reef, Fiery Cross Reef, Cuarteron Reef, and Gaven Reef. According to the Asia Maritime Transparency Initiative, China has created more than 3,200 acres of new land.3

    At first, China claimed to be using the South China Sea for non-military purposes, such as humanitarian aid and scientific research, yet several of these islands are now home to airfields, antiaircraft and antimissile guns, and naval guns. Cuarteron Reef, for example, hosts a high-frequency radar for detecting aircraft, which is hard to square with “peaceful purposes.” On Woody Island, China has installed HQ-9 long-range missiles.

    When considering former leader Hu Jintao’s stated policy of a “peaceful rise,” this clear territorial grab seems out of character.4 These actions have alienated China’s neighbors and drawn in other powers such as India, Japan, and the US. One theory to arise from these actions is that China’s leadership decided the potential diplomatic trouble was worth having a secure, sea-based deterrent bastion.

    This echoes the Soviet Union’s actions during the Cold War, when their missile submarines were operated from “bastions” in the Barents Sea and Sea of Okhotsk. This made it so that Soviet missile submarines could be covered by land-based and naval forces in the event of enemy attacks.

    China’s sea-based nuclear deterrent relies upon four Jin-class submarines. It is China’s belief that US ballistic-missile defenses threaten the credibility of its nuclear deterrent, which makes protective bastions in the South China Sea all the more important. Due to China’s geography, there is essentially only one ocean available for such a bastion. The Northern Pacific is out because of the U.S. Navy and the Japan Maritime Self-Defense Force. The South China Sea, in contrast, is bordered by countries that pose no threat to China’s nuclear missile submarines.

    By installing a permanent presence in the South China Sea, China makes its regional dominance difficult to challenge. It also gains the ability to install a permanent sensor system. Further construction will likely strengthen defenses against submarine warfare aimed at China’s sea-based nuclear weapons. China will also likely deploy more surface-to-air weapons and land-based antiship missiles to, at the very least, protect its current military installations. Heavier defenses are being justified by the freedom-of-navigation operations conducted by the US and its allies.

    This points to the main weakness of China’s string of islands: they can only be defended for so long. Ships can move; islands cannot. An island cannot be stocked with enough weaponry, food, water, and electricity to remain a viable defensive outpost indefinitely, a limitation exemplified by Iwo Jima and Okinawa. Should a military confrontation arise between China and the US in the South China Sea, China’s at-sea outposts could be rolled back quickly by missile attacks and airstrikes, stranding People’s Liberation Army Navy personnel on the islands. How China would respond to such an attack is an important question, as a victory over the islands could prove a dangerous turn of events. While these islands offer strategic value, as a defensive solution they are prone to destruction in the event of war. They also violate China’s pledge not to militarize the sea, which is why they are of such concern to foreign militaries.5 This tension is putting more pressure on China’s economy.

    Domestic Economic Health

    China’s economy is in peril not only because of foreign military pressures, but also because of its own domestic economic weakness. Increasingly, central bankers fear a systemic collapse of China’s market. Their fears stand to reason: before the 2008 financial crisis, China’s debt-to-GDP ratio was 147 percent; now it is about 250 percent. Quietly, Chinese leadership has begun to lower growth expectations. While the official growth rate given was 7.5 percent, privately the government believes that 6.5 percent is more realistic over the next year.6

    The methodology used to calculate this figure is not publicly known, but it draws on economic data that can be manipulated to make the government’s economic plans appear more successful than they are. The components used to determine GDP are shifted between quarters, and the aggregate numbers are often built upon inaccurate or intentionally manipulated data.

    There are other metrics by which to gauge economic growth, such as the ISM manufacturing index, which many countries use. The ISM can be somewhat unreliable, however, because the surveys upon which it is based may be inaccurate. Other indicators include coal, iron, cement, and copper inventories, though these too can be unreliable measures of growth. Before taking his current position, Premier Li Keqiang suggested indicators such as electricity consumption and auto and excavator sales, hinting that these may be what China’s leadership actually watches. Recently, however, those measures have suggested a deceleration in growth.

    Domestically, there is growing anxiety about China’s financial market as well. Local banks are underperforming, with non-performing debt increasing. Non-bank financial institutions, referred to as the “shadow banking system,” are spreading with little regulation or common understanding; there is so little oversight that meaningful data about these institutions is difficult to come by. Two of these financial intermediaries have failed so far, and it is hard to say how quickly, or how many, more may fail. While the government claims to be trying to rein in this system, it seems to be struggling to balance regulation against the risk of triggering a systemic collapse. There also appears to be a degree of corruption, with local government officials tolerating and perhaps even encouraging questionable practices by investors, such as inflated assessments of assets backed by little documentation. Between this and short-term debts owed to foreign banks, the entire shadow banking system is threatened with collapse.

    Another Chinese economic issue is the real estate market. Commercial real estate bubbles are abundant, and residential real estate values have begun to fall. This raises the possibility of social unrest as more families lose money to falling property values. And as is typical of many countries, the Chinese government adjusts its inflation measures downward, concealing the impact inflation has on households.

    China was able to grow rapidly on the back of global trade, but the Lehman collapse caused world trade to plateau, hitting China where it hurt, as exports accounted for more than 40 percent of its GDP. Global trade did eventually pick back up, but it has more recently begun to slow again, now growing more slowly than world production when in the past it grew twice as fast. With this slowdown, China is losing competitiveness in the marketplace. The growth of the Chinese economy brought the need for higher wages, so China moved its exports up the value chain, manufacturing more technologically advanced and complicated products. Quality suffered in turn, which caused foreign buyers to stray away from Chinese components, and more and more foreign manufacturers moved production away from China in favor of other countries, such as Vietnam. China is no longer the default location for overseas production; foreign companies now enter mainly when targeting the domestic market. World trade growth continues to slow as the consequences of the 2008 financial crisis live on today, so China must focus more on building its domestic economy. That will prove difficult: with a shrinking population, an aging populace will be responsible for more and more of the country’s economic output, and the fractious nature of the government will only make matters harder.

    A Fractious Regime

    From an outside perspective, the Chinese government seems unified and orderly. This is hardly the case. In the United States, we are aware of the competing powers in place: in the two-party system, we often see tension between the people in power, particularly with a Democratic president and a Republican Congress or vice versa. Yet we often fail to recognize these tensions in foreign governments. In China, just as in the United States, different political groups hold different views on what policies should be put in place and how they should be enforced. This holds true even within the Politburo itself.7

    Currently, there are two main factions within the Politburo that emerged after Deng Xiaoping’s death. These are the Shanghai bang, led by Jiang Zemin (a former leader of China), and the Youth League Faction, led by Hu Jintao (another former leader). Xi Jinping, the current leader, is of the Shanghai bang. However, once Jiang Zemin passes away, the Shanghai bang could break into even more segments. Two other political factions are the New Left (which emphasizes disciplined management) and the Princelings (sons of former Party leaders and their families).

    Bo Xilai, now disgraced, was a Princeling, and used that status to build a network within the New Left as well as with military and state security leaders, along with a political base in Chongqing.8 Currently, Xi Jinping is leading an effort to eliminate this power network by accusing those connected with it of corruption. Xi has so far reshaped the Politburo, the Central Committee, and the Standing Committee. He has also gone after state security, going so far as to arrest its former head, Zhou Yongkang, for corruption. Xi has likewise begun a reconfiguration of the PLA leadership, but this has proven trickier.

    The PLA is organized into seven military regions, the most powerful of which is the Shenyang region, which borders North Korea. The connections between the PLA, Bo Xilai, and Zhou Yongkang are quite strong in this region, and there are rumors that its generals are involved with North Korea. One way Xi has attempted to consolidate power is by putting more of the military budget behind the Navy and Air Force, granting less money to the land-based military.

    Thus far, Xi’s leadership has not been tested enough to know how long it can be sustained. While anti-corruption efforts have been launched, no one expects all corruption to be rooted out. In reality, some parts of the Chinese power structure are likely to lose influence while others gain more. Since Xi came to power, many wealthy Chinese have prepared to move their wealth abroad, since little is known about how Xi’s consolidation of power will come to affect them. This can be seen in the elevated demand among Chinese buyers for property in cities around the world, including New York, London, Toronto, and Sydney. Wealthy Chinese children are increasingly educated abroad and left to manage the family wealth overseas.

    Given the political turmoil, it is understandable why foreign central and private bankers question whether the Chinese leadership can handle a reconfiguration of the financial market. The fractured nature of the government casts serious doubt on its ability to make unified decisions, and bankers are apprehensive about China’s ability to navigate its bubble-ridden economy at the moment.

    At the moment, China’s number one goal is to prevent social unrest. There are growing efforts to unify China against its neighbors (Japan in particular) so that citizens rally around their country against an external rival. While China may be able to reform certain economic problems, such as shadow banking, it is also likely to damage its foreign interests in the process.

    The fragility of China’s economy is important to understand given the current state of the global market. An understanding of the big-picture impact of the turmoil of foreign economies is one of the many ways our team of experts at Meraglim™ can help you understand what actions to take within the global market. If you need financial data analytics, we can help with our experienced staff and unique risk assessment software. Contact us today to learn more.


    1. http://www.tradingeconomics.com/china/gdp-growth-annual
    2. http://nationalinterest.org/blog/the-buzz/what-makes-chinas-fake-island-military-bases-the-south-china-19399
    3. https://amti.csis.org/long-term-strategy-scs/
    4. http://nationalinterest.org/feature/the-real-reason-chinas-massive-military-buildup-12502
    5. http://thediplomat.com/2017/01/the-civilization-of-chinas-military-presence-in-the-south-china-sea/
    6. http://www.sldinfo.com/chinese-economic-difficulties-exploiting-global-tensions/
    7. http://www.strategicstudiesinstitute.army.mil/pdffiles/PUB995.pdf
    8. https://muse.jhu.edu/book/1385
  3. The World-Changing Impact of Blockchain Technology

    In the last few years, you may have heard some buzz about blockchain technology. A recent report from the World Economic Forum estimated that by 2025, 10 percent of global GDP will be stored on blockchains or in related technology.1 With this in mind, it is worth understanding this technology now so you can get a jump on what will certainly be an abundance of opportunities.

    What is blockchain technology?

    Blockchain technology has been described as “the internet of value.”2 The internet is where we share information, and while we can transfer value (money) online, doing so requires intervention by financial institutions like banks. Currently, all online payment methods (like PayPal or Venmo) require a connection to a bank account or credit card. Blockchain technology is intriguing because it offers the potential to eliminate the need to involve banks in transactions: it records transactions, confirms identity, and arranges contracts, all of which previously required a financial institution. Today, blockchains, also known as “distributed ledgers” or “digital ledgers,”3 are used to keep track of transactions in bitcoin and other cryptocurrencies; however, this technology has the potential to revolutionize far more than financial services.

    Bitcoin and other cryptocurrencies

    The blockchain was invented by a person (or group) going by the pseudonym “Satoshi Nakamoto,” the creator of bitcoin.4 Bitcoin is a kind of digital currency that is exchanged directly between the two parties to a transaction; no bank is necessary as an intermediary.5 Bitcoin was invented in response to the 2008 financial crisis; the mysterious Nakamoto, whose real identity has never been established, published a paper outlining the problems of traditional fiat currency and presenting bitcoin as an alternative.6 When it was first released, bitcoin excited people because it offered the possibility of escaping the credit bubble cycle that is a staple of traditional currency. But digital money raises a problem that banks normally solve: financial institutions keep track of every transaction to ensure that no dollar is spent twice, and with paper currency, you obviously can’t keep reusing the same bill. With a purely digital currency, there was the potential for someone to spend the same bitcoin again and again. Nakamoto created the blockchain to combat this double-spending issue. The cryptography behind it has so far proven extremely resistant to attack, leading many to believe that Nakamoto is either a singular genius or a pseudonym for a team of advanced programmers and economists. It is unlikely that the true identity of this innovator will be publicly known anytime soon; after all, it makes sense to stay hidden when experimenting with currency this publicly. When Hawaiian resident Bernard von NotHaus produced and sold “Liberty Dollars,” he was arrested in 2009 and charged with breaking federal law. Nakamoto’s anonymity allows him (or her, or them) to provide this digital currency to the world without repercussion.

    How it works

    In the context of bitcoin, the blockchain serves as a database that holds the payment history of every single bitcoin, serving as proof of ownership.7 The blockchain is broadcast to a network of thousands of computers, known as “nodes,” which are spread all over the globe and publicly accessible. Despite how open the system is, it is also remarkably secure and trustworthy. How is that possible? Through its “consensus mechanism”: the process by which nodes work in tandem to update the blockchain when value is transferred from one person to another.

    For example: Jill wishes to use bitcoin to pay Bill for his services. Jill and Bill each have their own bitcoin wallet, software that manages their bitcoin by interacting with the blockchain without revealing their real-world identities to the system. Jill’s wallet broadcasts a proposed transaction: her wallet loses the bitcoin and Bill’s gains them. Before this is confirmed, the network goes through a number of steps. Upon receiving the proposal, the nodes check whether Jill actually has the bitcoin necessary to make the transaction. If she does, a specialized group of nodes called miners bundle her proposal with other pending transactions to create a new block for the blockchain. To do this, miners feed the block’s data through a “hash” function, which condenses the block into a string of characters of a fixed length. Hashing is a one-way process: it is easy to compute the hash from the data, but practically impossible to recover the data from the hash. The hash does not contain the data, yet it is effectively unique to it: if a block is changed in any way, even by a single digit, a different hash results.

    The hash is then put into the header of the new block. This header becomes the basis of a mathematical puzzle that again uses the hash function, and the puzzle can only be solved by trial and error. Miners churn through trillions of possibilities looking for the answer. Once a miner discovers a solution, it is checked by the other nodes (solving takes enormous effort, but checking is simple), and once confirmed, the block is added to the blockchain. The hash of its header becomes the block’s identifying string, and it is officially part of the blockchain. Jill’s payment to Bill is confirmed and reflected in their bitcoin wallets.
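
    To make the hashing and trial-and-error puzzle concrete, here is a toy Python sketch. It is not the real Bitcoin protocol (which double-hashes an 80-byte header against a difficulty target), but it shows the same principle: miners vary a nonce until the header’s hash meets a condition, and anyone can verify the answer with a single cheap hash.

```python
import hashlib

# Toy illustration of the hashing and trial-and-error "puzzle" described above.
# The "header" string below is just a stand-in for real block header fields.

def hash_header(header: str, nonce: int) -> str:
    return hashlib.sha256(f"{header}{nonce}".encode()).hexdigest()

def mine(header: str, difficulty: int = 4):
    """Trial and error: find a nonce whose hash starts with `difficulty` zeros."""
    nonce = 0
    while True:
        digest = hash_header(header, nonce)
        if digest.startswith("0" * difficulty):
            return nonce, digest
        nonce += 1

nonce, digest = mine("prev_hash|merkle_root|timestamp")
print(nonce, digest)  # finding this took many tries; checking it takes one hash
```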

    This method introduces three factors that ensure the security of bitcoin. The first is chance. There is no way to predict which miner will find the solution to the puzzle, so it is impossible to know in advance who will update the blockchain, which makes it difficult to game the system. The second is the extensive history embedded in the blockchain itself. Each header contains the hash of the previous header, which in turn contains the hash of the one before it, and so on back to the very beginning; this chaining is what links the blocks of the blockchain together. Changing anything in any header, even in the earliest blocks, therefore changes every subsequent header. Because the altered chain no longer matches the blocks held by the rest of the network, it will not be accepted.
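
    The chaining property is easy to demonstrate in a few lines. In this simplified sketch (again, not Bitcoin’s actual data structures), each block’s hash is computed over the previous block’s hash, so rewriting an early transaction changes every hash that follows it:

```python
import hashlib

# Sketch of the chaining described above: each block's hash covers the
# previous block's hash, so editing any early block changes all later hashes.

def block_hash(prev_hash: str, data: str) -> str:
    return hashlib.sha256(f"{prev_hash}|{data}".encode()).hexdigest()

def build_chain(transactions):
    chain, prev = [], "0" * 64  # placeholder "genesis" hash
    for data in transactions:
        prev = block_hash(prev, data)
        chain.append(prev)
    return chain

original = build_chain(["Jill pays Bill 1 BTC", "Bill pays Ann 0.5 BTC"])
tampered = build_chain(["Jill keeps her 1 BTC", "Bill pays Ann 0.5 BTC"])

# Changing the first block changes its hash and every hash built on top of it,
# so the tampered chain no longer matches what the rest of the network holds.
print(original[0] != tampered[0], original[1] != tampered[1])  # True True
```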

    Is there any way to cheat the system? Technically yes, but it is highly unlikely. Say Jill decides she wants to rewrite history so that, instead of the bitcoin going to Bill, they stay in her wallet. If she had enough mining power, she could potentially solve the puzzle and produce an altered version of the blockchain. However, in the time it took her, the rest of the nodes would have added more blocks to the original chain, lengthening it, and nodes always work on the longest version of the blockchain. This rule exists to resolve the situation in which two miners find a solution at the same time, which otherwise causes only a temporary fork; it also prevents Jill from cheating the system. In order to get the network to accept her version, Jill would have to lengthen her chain faster than the rest of the network extends the original. To do so, she would need control of more than half of the network’s computing power, which makes cheating practically impossible.

    The final way the security of bitcoin is ensured is through incentives for the nodes. When a new block is created, new bitcoin are minted along with it. The miner who solves the puzzle earns 25 bitcoin, currently worth roughly 7,500 dollars.

    However, as clever as this system is, bitcoin is still not an especially attractive currency. Its value is unstable, and the amount in circulation is intentionally limited. Yet the underlying blockchain technology works so well that it has created a great deal of buzz about potential uses beyond bitcoin. Clearly, there is great potential for this technology to disrupt the financial services industry. Blockchains will likely help improve existing processes, making them more secure, less expensive, and more efficient. Additionally, new products beyond what we can even conceive of right now will be invented, turning financial institutions on their heads. However, the applications of this technology go well beyond the world of banking.

    Defense applications

    In the world of defense, blockchains show promise because of their security. Currently, the Defense Advanced Research Projects Agency (DARPA) is looking into ways to use blockchain technology to secure military systems and ensure safe storage of nuclear weapons, among other potential applications.8 Because blockchains are extremely difficult to tamper with, the military is interested in using the technology to maintain the integrity of highly sensitive data, and DARPA has contracted the computer security company Galois to verify a blockchain technology created by Guardtime.9 If the project goes well, blockchain technology could soon begin to be implemented in military systems. What is particularly attractive about blockchain is not only that it is hard to penetrate, but that even if a hacker were to get into a secure military network, they would be unable to make damaging changes, as only authorized users can alter the record.10 This is ideal for military use, as it would help prevent anyone from hacking in and gaining control over military satellites or nuclear weapons. Today, even a hacker who couldn’t gain direct control over a weapon could interfere with military communications without being noticed, which is why DARPA is particularly interested in using blockchain technology to develop a new messaging platform that would allow for completely secure communications.11

    Commercial applications

    Blockchains have clear applications for the financial services industry and the military, but they can also be used to enhance the experience of consumers. The widespread use of blockchain has the potential to enable a shared economy.12 A movement in this direction can already be seen in companies like Airbnb and Lyft, but by enabling peer-to-peer transactions on a wider scale, blockchain technology could create a sharing economy that doesn’t require a middleman (and therefore, transaction fees). Consumers could also benefit because they would have greater access to information about what exactly goes into their products. More and more, consumers want to verify the claims companies make, and the transparency blockchains create would make it far easier to verify or disprove lofty claims. This would mean that reputation would matter more than ever for businesses. Additionally, people will be able to feel more comfortable using the internet for financial transactions, as blockchains make identity management far simpler; by being able to verify identity online, both the business and the consumer can trust the transaction. This is truly only the tip of the iceberg when it comes to commercial applications of this technology.

    Government applications

    One place people are starting to buzz about blockchains is the world of governance.13 Blockchain could usher in an era where voter fraud and government corruption are exposed through code. Traditional voting systems would have to be moved online, which would bring more transparency because it would hold the voting system accountable. Additionally, the extensive history that blockchains provide would make it harder for politicians to get away with outright lies, as there would be hard data to the contrary that everyone could accept; the public would be more intimately acquainted with the truth because the blockchain could serve as a built-in lie detector. It could even come to pass that decision-making processes are streamlined through code, meaning necessary changes in law could occur at a much more accelerated pace.

    Blockchain technology is set to revolutionize financial institutions, the military, the private sector, and the world. The potential uses of this technology come to light more each day as more industries become aware of its security and reliability. Though initially created for bitcoin, which has faced controversy and may not stand the test of time, blockchain technology has the potential to change the entire world.

    Further Reading

    Want to learn more about blockchain technology? Read further with these links below.

    What is Blockchain Technology? – Blockgeeks
    5 Ways to Invest in the Blockchain Boom – Investopedia
    The Great Chain of Being Sure About Things – The Economist
    Bitcoin Blockchain Technology In Financial Services: How The Disruption Will Play Out – Forbes
    Block Chain 2.0: The Renaissance of Money – Wired


    1. http://www3.weforum.org/docs/WEF_GAC15_Technological_Tipping_Points_report_2015.pdf#page=24
    2. http://www.forbes.com/sites/bernardmarr/2016/05/27/how-blockchain-technology-could-change-the-world/#70108f1c49e0
    3. http://www.blockchaintechnologies.com/blockchain-definition
    4. http://blockgeeks.com/guides/what-is-blockchain-technology/
    5. https://bitcoin.org/bitcoin.pdf
    6. http://www.newyorker.com/magazine/2011/10/10/the-crypto-currency
    7. http://www.economist.com/news/briefing/21677228-technology-behind-bitcoin-lets-people-who-do-not-know-or-trust-each-other-build-dependable
    8. https://www.deepdotweb.com/2016/10/20/blockchain-technology-may-borrowed-darpa-secure-military-networks/
    9. https://qz.com/801640/darpa-blockchain-a-blockchain-from-guardtime-is-being-verified-by-galois-under-a-government-contract/
    10. http://www.popularmechanics.com/military/research/a23336/the-pentagon-wants-to-use-bitcoin-technology-to-guard-nuclear-weapons/
    11. https://bitcoinmagazine.com/articles/darpa-nato-looking-at-military-applications-of-blockchain-technology-1464018766/
    12. http://blockgeeks.com/guides/what-is-blockchain-technology/
    13. http://blockgeeks.com/blockchain-voting/
  4. How to Invest in Gold

    Gold is the quintessential dollar-hedge investment. To counter a falling dollar, the best course of action is to invest in gold, whether you invest directly in the metal itself, purchase gold mining stock, or choose mutual funds. As the dollar falls, gold rises. The American blue-chip industry can no longer compete in the global economy, and gold will be the key to growth. Even if we do not return to the gold standard, the value of gold has always been high (as you can see by reading our History of Gold post), and this will not change. Therefore, it is in your best interest to invest in gold now. Fortunately, there are several different ways to invest in gold, which we outline below.1

    Direct Ownership

    Gold bullion is pure value, as man’s relationship with gold throughout the centuries shows. Ancient Egyptian civilizations buried mountains of gold with their pharaohs because they believed it would be needed in the afterlife. Gold has started wars, and the conditions people endured during the gold rushes show just how desperate they were for this precious resource. This stands to reason, as gold holds its value in a way paper money never can. Gold cannot be controlled by governments the way paper money can, which is precisely why governments go off the gold standard.

    The value of gold rises based on the simple economic principle of supply and demand, regardless of interest rate policy or the manipulation of paper money. The one significant disadvantage of physical gold is the wide spread between bid and ask prices, so you cannot expect a quick profit. You buy at a retail price and sell at a wholesale price, so the price has to move significantly in your favor just for you to break even. That is why gold should not be treated as a speculative asset; instead, consider it a defensive asset that holds value. As your dollars fall in value, gold allows you to preserve value. To own gold directly, it is best to invest in minted coins, such as Canadian Maple Leafs, American Eagles, or South African Krugerrands.
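
    A quick, hypothetical calculation shows why the bid/ask spread matters. The prices below are made up purely for illustration; the point is that the market must move by roughly the size of the spread before selling even breaks even:

```python
# Illustration of the bid/ask point above with made-up numbers: if you buy
# physical gold at a retail (ask) price and can only sell at a wholesale (bid)
# price, the market has to rise by the size of that spread before you break even.

ask_price = 1300.0   # hypothetical price you pay per ounce
bid_price = 1235.0   # hypothetical price a dealer would pay you today

required_gain = (ask_price - bid_price) / bid_price * 100
print(f"Gold must rise about {required_gain:.1f}% before a sale breaks even.")
```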

    Gold Exchange Funds

    Another way to invest in gold is through a gold exchange-traded fund (ETF). ETFs are a kind of mutual fund that trades on the stock exchange, and their portfolios are fixed rather than actively managed. Gold ETFs hold gold bullion as their sole asset. The two main gold ETFs in the U.S. are “GLD” (SPDR Gold Trust) and “IAU” (iShares COMEX Gold Trust). Either is a sensible option for holding gold.

    Gold Mutual Funds

    Gold mutual funds are a sensible alternative for those who hesitate to hold physical gold bullion. These funds hold stocks in companies that mine for gold, such as Newmont Mining (NEM), a well-capitalized company with a good track record of turning a profit. Newmont is an example of a senior gold stock: an investment in a well-established company whose mines produce a substantial amount of gold every year. This is a more conservative play than investing in newer gold companies.

    Junior Gold Stocks

    Junior gold stocks are more speculative than senior gold stocks. They are shares in companies that have less productive mines or that are still exploring, which can lead to a large profit but presents a greater risk. These stocks are best for investors with a higher risk tolerance who can accept potential losses in exchange for possible major gains.

    Gold Options and Futures

    Options may be the right move for seasoned investors, allowing you to speculate on gold prices. In the options market, you can bet on movements in either direction. Buying a call means you expect prices to rise: a call fixes your purchase price, so the higher the market price climbs above it, the larger the margin between your fixed option price and the current market price. When you purchase a put, on the other hand, you anticipate the price falling.
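
    For illustration, here is a toy payoff-at-expiration calculation for a gold call and put. The strike and premium figures are hypothetical and ignore commissions; the sketch simply shows how a call profits when prices rise above the strike and a put profits when they fall below it:

```python
# Toy payoff-at-expiration calculation for the call and put described above.
# Strike and premium figures are hypothetical, purely for illustration.

def call_payoff(spot, strike, premium):
    """Profit per ounce from a call: gains if gold rises above the strike."""
    return max(spot - strike, 0.0) - premium

def put_payoff(spot, strike, premium):
    """Profit per ounce from a put: gains if gold falls below the strike."""
    return max(strike - spot, 0.0) - premium

strike, premium = 1300.0, 25.0
for spot in (1200.0, 1300.0, 1400.0):
    print(spot, round(call_payoff(spot, strike, premium), 2),
          round(put_payoff(spot, strike, premium), 2))
# At 1400 the call is worth 100 - 25 = 75; at 1200 the put is worth 100 - 25 = 75.
```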

    Buying options is a risky way to invest, and most people fail at it; about 75 percent of all options bought expire worthless. This is a complex market that requires deep knowledge. Options have two defining traits, one positive and one negative. The positive trait is leverage: investors can control a large position using a small amount of money. The negative trait is that options expire after a set amount of time, and as the expiration date approaches, the option’s time value melts away. For the majority of investors, the futures market is simply out of their league; even experienced investors recognize how high-risk it is. Large profits can be made, but they can be lost almost instantly.

    We don’t know when the dollar will collapse or how long it will take; we just know that it will happen. After years of mismanaged monetary policy by the Federal Reserve, there is no question of its inevitability. The removal of the U.S. dollar from the gold standard has had a long-term impact: Nixon saw it as a way to solve the economic issues of his time, but the result has been ever-growing debt, trade deficits, and endless money-printing creating a credit-centered economy. Taking a broad view of the global economic market, an investor can see that problems are inevitable. The trouble has been delayed slightly only because of China’s parallel economic troubles.

    China has built up debt on top of the troubled U.S. dollar and pulled other Asian economies along with it. The fall of the dollar will be a major issue not just for the U.S. but for many other countries as well. To offset this, commodities such as oil and, yes, of course, gold are a logical choice.

    This exemplifies the ironically predictable pattern of monetary policy. Governments overprint money, destroying their currency; then they go back to gold, at great expense and suffering. At this moment, we are on the precipice of another collapse as poorly planned monetary policy fails us. But we do not have to wait for the dollar to collapse. Respond today. By being proactive now, investors can get ahead of this collapse by investing in tangible assets that will hold value regardless of what happens to the U.S. monetary system.

    Currency is not valuable in itself; it is essentially an IOU that holds no real value of its own. Once the national credit limit is reached, monetary policy will have to change, and investors will lose. The investors who benefit will be the ones holding assets with tangible value. The key is starting from a good position before the collapse.

  5. What We Can Learn from Market Bubble Bursts

    Just like the bubbles children blow on playgrounds, economic bubbles all eventually burst. Yet people often have a short memory when it comes to bubbles, because it is difficult to see a bubble for what it is in real time. When we look back at price charts later, it is easy to identify bubbles, but when a bubble starts to inflate, we only see one piece of the puzzle. Markets are a direct reflection of people’s opinions. When people feel the market is healthy, it functions well because people put more money into it. As people’s good opinions of the market grow, the bubble grows with them, and this is where the trouble comes in: the opinions become skewed. Investors make more impulsive decisions, while the more level-headed investors are overwhelmed by their overly enthusiastic counterparts.

    People Become Irrational

    A common definition of a bubble is when prices rise above any logical valuation of the asset. Naturally, this is subjective; one person might think an asset is fairly priced while another thinks it completely overpriced. A bubble involves more than just “overpriced” valuations, though; it also involves people abandoning logic they held before the bubble. In a stock market bubble, this may take the form of new valuation metrics justified with claims that an industry is “different” from the others. These arguments are sometimes valid, but often they are not, which is a good reminder to do your due diligence before investing. People will claim that a new asset is so radically different from anything before it that the old rules do not apply. However, this thought process does not incorporate the full picture.

    While a new industry may bring some change, certain things never change, like human nature, and many aspects of human nature come together to make bubbles possible. For example, human beings are not great at judging scale: we may correctly judge that a narrative justifies a price rising, but we are not great at judging how high it should go. Bubbles are also fed by confirmation bias; every piece of news that supports the new valuation metrics reinforces the group belief that this time is radically different, while negative stories are brushed aside because they do not confirm what investors wish to be true. A telltale sign of a bubble is story after story emerging faster than they can be challenged; this is a clear indication that the market is being driven by emotion rather than logic.

    Participation Falls

    As prices rise, the risk of betting against them gets higher and higher, and doubters retreat to the sidelines. This leaves the market populated only by the people who are bidding prices up. Eventually, this exponential rise must reverse as people begin to sell and the price falls. Then the doubters come back, and the cycle continues. Bubbles occur when the market skews toward investors who are too positive about an asset. This is not just a theory; it can be seen in three bubbles that have formed around three assets in the 21st century: gold, oil, and the internet.

    Gold in 2011

    In 2000, gold traded at about 300 dollars an ounce. By the end of 2010, it had soared to 1,300 dollars, and the price kept climbing through 2011, reaching a peak of $1,921 an ounce on September 6th. People just continued to buy as the price rose, out of an emotional response; they needed to invest in gold because it was clearly becoming more and more valuable, right? Soon after, however, prices began to fall. By the end of the year, gold was back down to 1,600 dollars, and it continued to bounce around until April 2013, when it crashed to 1,200 dollars. The bubble burst, and gold investors took a hit.

    This is not to dissuade investors from gold. Gold can be volatile, but with the right approach, it is an essential investment. Gold is a long-term investment and an important asset to have in the event of a financial crisis. Gold is tricky to value as its price moves up and down based on the current political and economic climate. However, when the market comes crashing down around us all, based on history, we can expect gold to skyrocket. Gold is the best fallback for a financial crisis, but you cannot expect to instantly make money on it.

    Oil in 2008

    By mid-2008, it was clear that stocks were on the way down, and many investors feared the worst (rightly so!). Some responded by throwing all of their efforts into investing in oil. The price of oil started to rise quickly, just as gold's would three years later. The difference is that oil's collapse came much faster and fell far more dramatically. After peaking at 147 dollars a barrel on July 11, 2008, the price started to drop; by December, it was just about 30 dollars. This obviously had devastating consequences for oil investors.

    Of course, much of this was due to the United States' recession, which took most of the developed world with it. However, the demise of oil could have been predicted had investors not bought into the one-sided narrative that oil was the answer. People saw that demand for oil was high and supply was low, which seemed like a clear signal to buy oil stocks, but there were also signs that a bubble was forming. One indication of a bubble is that the market listens to only one narrative, ignoring the negative side of the story. Investors saw the high demand for oil, but refused to see the impending recession.

    When it comes to investing in oil, it's important to keep in mind both sides of the equation. In 2008, people saw the incredible demand for oil as a clear sign that they could make a lot of money off of it; however, this was a one-sided way of looking at it. Once the recession hit, demand slumped, and oil investors got hit hard. The lesson here is that these variables can turn against you drastically and quickly. If you are going to invest in oil, it's wise to be selective about the companies in which you buy stock. Oil prices will fluctuate and every oil company will be affected by them, but a company with a good business and smart management will weather the unpredictability better than a company without those essentials.

    Again, oil serves as a reminder that it is smartest to invest with the long term in mind. This makes you less likely to jump on a trending asset just because it looks like you will make an immediate profit. It also means you can weather the variability of the market better and wait out any undervalued asset, knowing that it will inevitably rise again.

    At the moment, oil looks set to remain volatile for a while. This is due to shale. Starting in 2011, the United States began producing more and more crude oil thanks to fracking, the process through which oil and gas deposits are extracted from shale rock. This not only increased the supply of oil, but also took away Saudi Arabia's ability to set the oil price for the world. With shale, the production of oil could respond to a rising price. This is a significant change for the oil industry; previously, it was believed that bringing new supplies of oil online would take years and cost billions of dollars. Shale oil suppliers have not stopped production as prices have fallen; in fact, Saudi Arabia has increased its own production.

    The Dot Com Crash of 2000

    At the beginning of 1997, the Nasdaq Composite, a stock market index weighted towards tech stocks, was at 1,291. By 1999, it had tripled. It peaked at 5,132 in March 2000. For the rest of 2000, however, it slowly fell. By 2001, it was trading under 2,500. By October of the following year, it was at 1,108, below its pre-bubble level. How could this happen? Again, people fell victim to listening to only one side of the story. The internet was going to change everything; we were no longer restricted by geography, and technology was poised to serve millions of consumers. Investors anticipated gigantic profits.
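
    To put the size of these three collapses on a common footing, here is a minimal sketch that computes the peak-to-trough decline for each asset using only the peak and trough figures cited in this post (single data points, not full price histories):

    ```python
    # Peak and trough levels cited above for the three bubble episodes.
    episodes = {
        "Gold (Sept. 2011 peak to April 2013)": (1921, 1200),   # dollars per ounce
        "Oil (July 2008 peak to Dec. 2008)": (147, 30),         # dollars per barrel
        "Nasdaq (March 2000 peak to Oct. 2002)": (5132, 1108),  # index level
    }

    for name, (peak, trough) in episodes.items():
        decline = (peak - trough) / peak
        print(f"{name}: {decline:.0%} peak-to-trough decline")
    ```

    Declines of roughly 38, 80, and 78 percent, respectively, are the kind of losses a one-sided narrative can set investors up for.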

    This story wasn’t completely wrong; the internet has delivered many of the benefits promised to us. However, investors took these promises and invested ten-fold. While some of the internet companies, such as Amazon, did deliver the anticipated returns for the people who stayed with them, there was a lot of variability during this course. People who bought Amazon stocks for 100 dollars a share then have seen it trade at four times that; however, these people had to stick through 2001, when it dropped to seven dollars a share. For every Amazon, there are a dozen or more companies that never got off the ground.

    So how do you avoid getting burned by tech stocks? The only guaranteed way is to never buy them, but that is not our advice; risk is inherent to investing, and trying to eliminate it entirely is a waste of your time. Instead, keep in mind how much you are willing to risk, because it is possible that nothing will come of your tech investment. Consider whether the buzz around a company has pushed the price of its shares above their true value. Also consider how much of the company's growth you will actually be able to benefit from: does the company have a competitive advantage in the market? How much of the money it earns from growth will have to be spent to keep up with the competition? Finally, consider how long you are willing to wait to turn a profit. Take the example of Amazon: can you wait the decade it took for it to become a profitable investment?
    With years of experience in industries such as defense, finance, intelligence, law, and the private sector, our team at Meraglim™ has keen insight into market moves and can provide your team with accurate predictions of future bubbles and their impending bursts. When you need financial data analytics, take advantage of our team of experts paired with innovative risk assessment software. Contact Meraglim™ today to learn more about how we can help you.

  6. 2017 U.S. Economic Trends

    Though the financial crisis of 2008 is now nearly a decade behind us, we still see its impact in the global and U.S. economies alike today. As we begin 2017, we reflect on the economic trends of the recovery period of the last seven years: slow and underwhelming growth, low labor market participation, and a high cost of urban living. Now that the "recovery" is over, things have returned mostly to normal, and growth is petering out, further economic growth will be significantly more difficult to achieve. As we face the new year, we can anticipate certain trends based on the past few years.

    Back to “Normal”

    By the end of 2016, economic growth had returned to its normal level. Since September 2015, unemployment has not been over 5 percent, reaching a low of 4.6 percent in November 2016. GDP has also been growing steadily, but not impressively; when the figure is adjusted for inflation and population growth, GDP grew 0.8 percent from the third quarter of 2015 to the third quarter of 2016. Private domestic investment grew to 17 percent of GDP during the recovery period, but has plateaued since then. At 16 to 17 percent of GDP, investment is just barely treading water. Increasing domestic investment will be a high priority for policymakers.

    Additionally, labor market participation has not recovered from the recession. This can be partially accounted for by baby boomer retirement: those who were born in 1951 turned 65 last year. However, the younger population is also less likely to be working or looking for work compared to workers before the financial crisis. This low participation in the labor force will be one of the main challenges for policymakers.

    The one exception to the relative normalcy the economy has returned to is monetary policy. For years, inflation has remained below the two percent target set by the Fed. As a result, the Fed has left its policies in recovery mode, where they will likely remain until inflation reaches two percent.

    Output

    For a recovered economy, GDP has been reasonable; however, it is far below what was expected before the recession. In previous recessions and depressions, GDP bounced back to its earlier trend. For example, after the Great Depression, incomes returned to their pre-crash path, and the economy continued to thrive for the next two decades. In the current economic environment, however, growth is lower and slower than that trend. The recovery from this recession has been disappointing because it has been slower than anticipated and did not return to normal in line with historical precedent.

    At the beginning of 2009, the CBO predicted that the economy would grow at more than 3.5 percent a year for four years starting in 2011, and would produce $20 trillion of output in 2016. In 2016, output actually came in below $19 trillion. This equates to a 7.5 percent income loss for investors and workers in the U.S.
    As a result, the CBO has downgraded its expectations for the future of the U.S. economy. For 2017, it anticipates output of $19.4 trillion, as opposed to the $21 trillion it predicted in 2009. In 2009, it predicted the business sector would grow 24 percent before 2017; now, it estimates 14 percent growth.
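
    As a rough illustration of how a revision like this translates into a percentage shortfall (this is our own arithmetic on the 2017 projections above, not a calculation taken from the CBO report), the gap can be written as:

    \[ \text{shortfall} = \frac{Y_{\text{projected}} - Y_{\text{revised}}}{Y_{\text{projected}}} = \frac{21 - 19.4}{21} \approx 7.6\% \]

    which is on the same order as the 7.5 percent income loss cited above for 2016.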

    This lack of growth can be attributed to a combination of three factors: low productivity, low capital, and low labor. The lack of productivity growth explains the output loss directly; indirectly, lower productivity also means lower returns on investment, which further reduces output as investors shy away.
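
    One conventional way to see how these three factors combine (a textbook growth-accounting identity, not a formula taken from the CBO material) is:

    \[ \frac{\Delta Y}{Y} \;\approx\; \frac{\Delta A}{A} + \alpha\,\frac{\Delta K}{K} + (1-\alpha)\,\frac{\Delta L}{L} \]

    where Y is output, A is total factor productivity, K is the capital stock, L is labor input, and α is capital's share of income. Weak growth in any of the three right-hand terms, whether productivity, capital, or labor, drags down output growth on the left.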

    Wage Growth

    Though economic growth has been underwhelming, wages have picked up. Even so, wage growth remains lower than anticipated, and the wage growth of 2015 to 2016 can be partially attributed to falling energy prices.

    Average compensation is mostly determined by labor productivity; the link between compensation growth and labor productivity has been established for decades. Labor productivity, in turn, is determined by the amount of capital per worker and by total factor productivity. Total factor productivity is a combination of technology, regulatory waste, market flexibility, and management practices.
    Potential labor productivity estimates by the CBO suggest that two factors have stifled wage growth: low productivity growth and low investment. In 2009, potential labor productivity was predicted to grow by 18 percent by this time. That estimate has since been dropped to 10 percent.
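
    In stylized form (again, a standard textbook relationship rather than the CBO's own model), labor productivity can be written as:

    \[ \frac{Y}{L} = A \left(\frac{K}{L}\right)^{\alpha} \]

    so output per worker, and the compensation it can support, rises with capital per worker (K/L) and with total factor productivity (A). Low investment holds down K/L and weak total factor productivity holds down A, which is consistent with the downgraded wage growth estimates described above.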

    In 2017, policymakers must focus on raising wage growth by focusing on how to increase labor productivity. This can be done in two ways. First, they can promote investment by cutting the marginal tax rate on new investments; reforming regulations to reduce the number of anti-competitive policies can also help, because these policies discourage investors. Alternatively, policymakers can boost wage growth by increasing total factor productivity growth; however, this is easier said than done. Getting productivity back even to its pre-crisis pace would be a feat, and there is no single action that can be taken to increase it.

    Productivity is largely determined outside of policy. Still, the most promising action policymakers can take is regulatory reform. The issue is that few individual regulations are significant enough that changing them alone could improve the economy; reform must be carried out across multiple sectors and through all levels of government.

    The Labor Market

    There are fewer Americans working today than there were before the recession. This is true even when excluding retirees and students. However, labor force participation did grow after real wages rose in 2015. The optimistic view for 2017 is that wage growth will continue and more non-working people will get off the bench and start contributing to the economy.

    Much of what has transformed in the economy is due to two factors: changes in traditional family dynamics and the implementation of certain public assistance programs. In the traditional U.S. nuclear family, up until as recently as 1980, men were cast as the breadwinners. Both the higher incomes of male-dominated professions and the weight of social expectation reinforced men's attachment to the labor force. Men were compelled to stay at their jobs, even if wages fell, because their families depended on it.

    Now, women work much more than they ever have, and many households are headed by women. There are more single mothers than ever, and more women who are the sole breadwinners of their families. There are also substantially more two-income households, with lifestyles that require both spouses to work. As a result, fewer men are committed to the labor force. Without a high enough wage, they have other options, such as living off a spouse or partner.

    There is also an increased number of people living on disability insurance and permanently detaching from the productive economy. Therefore, in 2017, welfare reform must be a high priority for policymakers, particularly in regard to disability insurance. As it stands, welfare programs penalize working; easing these restrictions means that more people could start to work, supplement their income with welfare, and slowly transition off of it. Additionally, housing assistance discourages marriage; eliminating penalties for marriage may encourage people to combine incomes, enabling them to get off of these social programs.

    Increased Cost of Living

    The value of workers' wages depends on the prices of what they buy. With increased regulation and restrictions on trade, U.S. prices have risen. These policies could stifle economic growth in 2017.

    On the federal level, increasing tariffs on energy, one proposed solution, would raise consumer costs, again stifling the benefits of economic growth for Americans. On the state and local levels, there are also regulations in place that are set to continue raising prices. For example, in 2016, New York City placed limits on short-term home rentals. These regulations lower incomes, increase costs, and decrease flexibility in the market.

    On both coasts, limits on residential construction are increasing the cost of living. In Silicon Valley, the ever-growing technology sector is not having the economic impact it should due to regulation of construction and urban growth in the area. Historically, when a thriving industry is concentrated in a city, it translates into shared prosperity. Now, local laws that limit development and population density prevent the economic growth of these cities, and working-class renters are priced out of the housing market and forced to move. Yes, these regulations have other benefits, but those benefits are often narrowly focused, aiding only the constituencies of those who hold power and oppose reform.

    Living standards are heavily impacted by these regulations. In 2015, the Heritage Foundation published a report that found the average household was set back $4,440 per year by just 12 regulations. Though there has been little improvement since then, Congress did repeal the crude oil export ban, which brought that list of regulations down to 11. In order to foster economic growth in 2017, policymakers at all levels need to prioritize deregulating consumer markets, most importantly housing, transportation, and energy.

    Conclusion

    In conclusion, while the U.S. economy did recover from the recession, the results have been underwhelming. There are fewer jobs, fewer workers, lower incomes, and higher prices than historical precedent would have predicted. Today, people are more content not to contribute to the labor force and not to invest, which drags on economic growth.

    Since the 2008 financial crisis, the fixes put into place by policy makers have failed, including stimulus spending, bailouts, large deficits, and financial market regulation. These attempts have not increased investment or economic participation in general, and the cost of living question has not been addressed for the majority of the country.

    Going forward in 2017, it is imperative for policy makers to increase investment incentives, reengage non-workers into the economy, and lower the cost of living. With the right policies and the united efforts of investors, workers, and inventors, income growth can be restored to pre-recession levels, and the average household income will rise. Additionally, regulatory reform aimed at lowering the cost of living may also increase income for the average American.

    At Meraglim™, our unique combination of risk assessment software and a panel of experts allows you to predict financial market moves before anyone else. If you want to learn more about how our team can help yours, contact us today.

  7. The Use of Data Analytics in Financial Markets

    At Meraglim™, we take a multidimensional approach to our product. Not only do we give our clients access to our panel of prominent experts from a variety of fields, we also use an AI analytic engine to provide data analytics as a service (DAaS). More and more, businesses are contracting with outside firms to provide DAaS, as the valuable information it provides can be revolutionary. With the emerging technologies breaking into the market today, it's imperative to implement the analytic tools at your disposal or risk falling behind the curve. Recently, our partner IBM collaborated with Saïd Business School at the University of Oxford to look into how banks and financial markets organizations are using data analytics to change the industry in "Analytics: The real-world use of big data in financial services." In this blog, we will review their findings and how data can serve your team.

    The Importance of Data in Financial Markets

    Banks and financial services companies have no physical product to offer. Supplying information is their trade, and data is an important resource for providing quantifiable support for their services. Within the financial services industry, there is endless data to be mined from the millions of transactions performed on a daily basis. The advantage that analyzing this data provides to financial institutions is evident; IBM and Saïd found that 71 percent of financial markets firms report having developed a competitive advantage by using financial data analytics, a 97 percent increase over the share reported in a similar IBM study two years prior. While banking data has grown to provide more valuable information, people in our technologically advanced world now bank and manage finances in a variety of ways, and this unstructured data holds important promise for insight into customers. Such detailed information can guide investors, financial advisors, and bankers in making the best decisions for their customer base while staying compliant with regulations. Companies have successfully used this data to identify business requirements and leverage their current infrastructure accordingly.


    Big Data Movements Today

    Most financial organizations today recognize the importance of big data and are slowly implementing plans for how to use it. The majority are either currently developing a big data plan (47 percent) or already running big data pilots (27 percent). In their study, IBM and Saïd identified four key findings that demonstrate how these companies are using big data.

    The customer is king

    More than half of industry respondents identified customer-driven goals as their priority for big data. This stands to reason, as banks face more and more pressure to be customer-centric. Financial institutions must keep the customer in mind when designing their technology, operations, systems, and data analytics. Data analytics is an important tool because it enables companies to anticipate changes in the market and in customer preferences, and to quickly take advantage of any opportunities that present themselves.

    Companies need a scalable big data model

    The research also found that the most important consideration companies must make when creating a big data model is that it must be able to accommodate the ever-growing amount of information from different sources. Though only about half of the companies surveyed said that they currently integrate information, IBM found that roughly 87 percent of respondents reported having the infrastructure necessary to accommodate the addition of more information.

    Integrating data across departments and business areas has challenged businesses for many years, particularly banks, due to the sheer amount of data that comes into play. This complex integration work is an essential component of any big data effort, and it most often requires adopting new analysis software such as NoSQL databases and Hadoop. However, the financial industry is falling behind in this respect.

    Efforts are focused on existing sources of data

    When looking at what financial institutions and banks are doing in terms of big data efforts, the majority are focused on using the data sources they already have internally. This makes sense: while big data has clear and important implications for the future of these companies, they want to take a cautious approach rather than chasing brand new data that could turn out to be useless. It also speaks to practicality, as there are many uses for these companies' internal data that are as yet untapped.

    Most commonly, respondents to this survey were analyzing log and transaction data. Every transaction and automated function of a bank's information systems generates data, in volumes that can no longer be analyzed by traditional means. As a result, these institutions hold years and years of data that have yet to be analyzed. Today, the technology finally enables this information to be used, though people with the right analytical skills are also necessary.
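
    As an illustration of what analyzing log and transaction data can look like in practice, here is a minimal sketch in PySpark, a common engine for Hadoop-scale data. The file path, field names, and schema are hypothetical placeholders, not details from the IBM study:

    ```python
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("transaction-log-rollup").getOrCreate()

    # Hypothetical location and layout of raw transaction logs (JSON records with
    # "timestamp", "channel", and "amount" fields); substitute your own source.
    txns = spark.read.json("hdfs:///data/bank/transactions/*.json")

    # Roll millions of raw records up into a daily, per-channel summary that an
    # analyst or a downstream model can actually work with.
    daily_summary = (
        txns
        .withColumn("day", F.to_date("timestamp"))
        .groupBy("day", "channel")
        .agg(
            F.count("*").alias("txn_count"),
            F.sum("amount").alias("total_amount"),
            F.avg("amount").alias("avg_amount"),
        )
        .orderBy("day", "channel")
    )

    daily_summary.show(20)
    ```

    The same rollup pattern scales from a single machine to a Hadoop cluster, which is what makes previously untouched years of transaction logs tractable.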

    Banks and financial markets firms could catch up to their peers by analyzing more varied types of data. Roughly 20 percent of respondents analyzed audio data, and about 27 percent analyzed social media. A lack of focus on unstructured data could be holding back their ability to do better in these areas.

    Analytical ability is important

    While data in and of itself plays an important role, it cannot be put to use without proper analysis. For big data to deliver the highest value, financial institutions must access the right data, use the right tools to analyze it, and have the skills needed to interpret it. This is why it may be necessary for financial institutions to bring in outside expertise, as they may not have the analytical skills in-house.

    While participants in the study who were engaged in big data efforts had a strong foundation in certain major analytics, such as basic queries and predictive modeling, these institutions need to work more on data visualization and text analytics. The more data there is, the more important these two elements are to gaining meaning from data. Yet only three out of five respondents with big data efforts included data visualization.

    Additionally, financial institutions fall significantly behind other industries in analyzing different kinds of data. Fewer than 20 percent of respondents included the ability to analyze natural text (such as call-center conversations) in their big data efforts. Text analytics allows companies to look not only at what was said, but also at the nuances of language, giving a bigger picture of what customers want and how to improve customer relations. Financial institutions fall even further behind their peers with other types of data, including geospatial location data and streaming data; while they may have the technology to analyze these areas, they rarely have people with the skills necessary to apply the data.
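
    To make the idea of text analytics concrete, here is a minimal sketch using scikit-learn. The call-center snippets are invented for illustration; a real deployment would draw on an institution's own transcripts:

    ```python
    from sklearn.feature_extraction.text import TfidfVectorizer

    # Invented call-center snippets standing in for real transcripts.
    transcripts = [
        "I want to close my account because the monthly fees keep going up",
        "The mobile app crashed again when I tried to deposit a check",
        "Thanks, the new card arrived quickly and activation was easy",
    ]

    # Weight words and two-word phrases by how distinctive they are per transcript.
    vectorizer = TfidfVectorizer(stop_words="english", ngram_range=(1, 2))
    tfidf = vectorizer.fit_transform(transcripts)
    terms = vectorizer.get_feature_names_out()

    # Surface the most distinctive terms in each transcript as a crude signal of
    # customer intent (fee complaints, app problems, positive feedback).
    for i, row in enumerate(tfidf.toarray()):
        top_terms = sorted(zip(terms, row), key=lambda t: t[1], reverse=True)[:3]
        print(f"Transcript {i}: {[term for term, _ in top_terms]}")
    ```

    Even this crude pass separates complaints about fees from complaints about the mobile app, which is the kind of nuance the study found most institutions are not yet capturing.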

    Recommendations

    Based on the information they gathered, the research team proposed several recommendations for financial institutions and their big data use. First, they suggested focusing efforts on the customer: understanding your customer is the key to success in the market. Second, they emphasized developing a big data plan that aligns with the business's needs and resources; while it is important to keep up with the technology, an effective blueprint must be in place so that challenges, and future data needs, can be addressed as they arise. Third, the researchers suggested initially building on already available data as a pragmatic way to approach big data analytics; businesses should let their own priorities for growth determine what data to examine, rather than simply looking at whatever is in front of them. Finally, they recommended implementing big data strategies with quantifiable measures of success. Most importantly, business leaders and technology specialists need to support each other throughout the effort to put a big data plan in place.

    Meraglim™ is a financial technology company that uses financial data analytics to provide our clients with the information they need to remain one step ahead of everyone else. If you are curious about how our financial technology may benefit your organization, learn more here today.

  8. Smart Dust and Microelectromechanical Systems

    Imagine a world in which tiny dust particles monitor everything on earth, providing seemingly endless amounts of data that has never been accessible before. These tiny sensors would float through the air and capture information about absolutely everything, from the temperature to the chemical composition of the air to any movements to even brainwaves. The implications of this technology would be transformative for a wide variety of fields and applications, from the military to health care to safety/security monitoring to space exploration. In this brave new world, the possibilities are endless.

    Sound like something out of a science fiction novel? It’s not just fantasy; it’s called Smart Dust, and after further research, these tiny sensors could be everywhere in the near future.

    Origins

    The initial concept of Smart Dust originated from a military research project by the United States Defense Advanced Research Projects Agency (DARPA) and the RAND Corporation in the 1990s.1 In 2001, the first prototype was invented by Kristofer S.J. Pister, an electrical engineering and computer science professor at Berkeley. Pister won the Alexander Schwarzkopf Prize for Technological Innovation for his work on the Smart Dust project.2 In 2004, Pister founded Dust Networks in order to bring Smart Dust to life. In 2011, Linear Technology, an integrated circuits company, acquired Dust Networks.3

    How Smart Dust Works

    Smart Dust is a system made of motes, or tiny sensors. Motes are essentially tiny, low-power computers that can perform many different functions and are composed of microelectromechanical systems (MEMS).

    MEMS

    Microelectromechanical systems are a class of technology that can be broadly defined as miniaturized mechanical and electro-mechanical components created by microfabrication.4 MEMS vary from very simple to quite complex, and are composed of tiny sensors, actuators, and microelectronics. Over the last few decades, MEMS technology has evolved to include an incredible range of sensor types, including temperature, chemical species, radiation, pressure, humidity, magnetic fields, and more. Interestingly, many of these microsensors function better than their macro counterparts; a micro pressure transducer, for example, often outperforms the most advanced macro equivalent. Not only are these devices extremely effective, they are made with the same manufacturing techniques used to create integrated circuitry, which translates to low production costs. The strong performance of MEMS devices paired with their low cost means that this technology has already made its way into the commercial marketplace. The capabilities of MEMS even today are remarkable; there are microactuators of many kinds, from microvalves that control liquid and gas flow, to micromirror arrays for displays, to micropumps that create fluid pressure. However, it is the combination of this technology with others, such as microelectronics, photonics, and nanotechnology, that will drive the truly meteoric rise of these devices as one of the most innovative technological developments of this century. In the future, Smart Dust will not only collect data, but also perform actions that manipulate the environment around it. With the diverse potential of MEMS devices, it will be thrilling to see where Smart Dust goes in the future.

    Composition

    One Smart Dust mote holds a semiconductor laser diode, a beam-steering mirror, a MEMS corner-cube retro reflector, an optical receiver, and a power source composed of batteries and solar cells.5 Beyond the astounding power of MEMS, Smart Dust is also made possible by wireless communication and advanced digital circuitry. This is why it is possible for the motes to be as small as they are while containing a battery, RAM, and a wireless transmitter. The idea is that the motes should be as tiny as possible while having an advanced operating system that enables the entire system to work together.

    TinyOS

    In the world of open source hardware and software development, two operating platforms are most often used: Arduino and TinyOS. The main difference between them is that TinyOS is specifically designed for low-power sensors with wireless communication. Therefore, while Arduino is easier to use, TinyOS is the ideal operating system for Smart Dust. TinyOS provides software abstractions well suited to smart buildings, personal area networks, smart meters, and sensor networks. The main issue with TinyOS in the context of Smart Dust is that it is specifically designed to run code in short snippets for a single function, rather than perform complex actions. So while it is great for collecting data with the motes, it is less capable of powering the base station that collects the data.


    Obstacles

    Despite the revolutionary nature of this technology, there are still obstacles to it being used as extensively as it could be. One obstacle is size; while MEMS sensors are quite small, with protective casing they are still bigger than a matchbox.6 Ideally, this technology would be tiny enough to be microscopic for a variety of purposes, so research centers in part on making it even smaller. Additionally, for Smart Dust to be valuable, the sensors must perform their measurements and then communicate back to a base station where the data can be compiled. Finding a way to do this reliably has been a focus of developers in recent research; potential solutions include optical transmission and radio frequency. Exactly how reliable communication between the MEMS sensors and the base station will be ensured is yet to be determined.

    Implications

    Smart Dust has astonishing possibilities for so many different industries that it's hard to pinpoint where it will have the greatest benefit. However, the military benefits are probably the most obvious, which is why it was developed through military research. Smart Dust could enable military personnel to gather critical information; for example, it could be used to detect movement around a corner and assess whether people are present and whether they are armed. Troops could receive critical information about enemy territory, putting them at an advantage during combat. The intelligence that Smart Dust could potentially offer the military is remarkable.

    However, Smart Dust has capabilities far beyond the defense sector. The variety of sensor types already afforded to us by MEMS means that the possible applications for Smart Dust are virtually endless. For example, Smart Dust could give us such precise meteorological insight that everyone would have exact, real-time information about the weather. Any type of research that is impeded by wired sensors could be revolutionized by Smart Dust; the motes could easily go into wind tunnels, anechoic chambers, or rotating machinery to acquire information. Beyond that, it has fascinating implications for biological research. For example, Smart Dust could be used to monitor the internal processes of small animals such as mice or insects. This could lead to unprecedented research into diseases and the effects of medication, as well as give us deeper biological insight than was ever possible before.

    Perhaps most radically, MEMS technology has amazing possibilities for space exploration. Smart Dust could be sent to another planet to collect data on the atmosphere and environment. It could be Smart Dust that determines that other worlds are habitable for humans. Undeniably, this has fascinating implications for the future of humanity and space travel.

    Other MEMS Projects

    Due to the obvious benefits this type of system can provide to the military, DARPA has continued to fund several projects in the realm of MEMS. This is promising, as many of the most innovative technologies of our time, including nuclear power, radar, jet engines, and the internet, were developed through military research. Several MEMS projects have come out of DARPA's Microsystems Technology Office (MTO). For example, DARPA recently awarded HRL Laboratories $1.5 million to develop a low-power oven-controlled crystal oscillator (OCXO) to power atomic clocks.7 To do so, HRL will combine MEMS technology with quartz dry plasma etching techniques, which will allow developers to create more efficient and reliable atomic clocks for the military. Outside of military applications, this technology could be applied to improve GPS technology and reduce the cost of producing handheld navigation systems.

    Additionally, DARPA is currently focusing its energies on developing Micro Power Generation (MPG).8 As stated above, MEMS technology is currently limited by its size, so a new focus is being placed on powering these devices without bulky batteries. The MPG program looks to use hydrocarbon fuels to power MEMS technology instead of the lithium-ion batteries currently in use. If successful, the power generator would be five to 10 times smaller than a battery of equal power, which could have incredible implications for military weapon systems and field awareness. It could also revolutionize the ways MEMS technology is used outside of the military, such as commercially or for geological or space research.

    As a financial technology company, we stay on top of the latest developments in technology so we can anticipate the changes that have a direct impact on the global money market and world at large. If you need our predictive powers, contact Meraglim™ today to learn more about how we can help your team.


  9. Why We Use Team Science

    At Meraglim™, our team is composed of top-level leaders from a variety of industries, including defense, capital markets, intelligence, science, and the private sector. These seemingly unrelated disciplines come together to form one of the greatest strengths we can offer our clients: effective team science. In the world of science, as technology has advanced and enabled us to work more collaboratively, the benefits of pooling the knowledge of different people from different fields have become well known. Today, most scientific articles are written by six to 10 individual scientists from several different institutions. We have all benefited from this new standard, as many scientific breakthroughs have occurred thanks to team science that otherwise would not have been possible; for example, the development of antiretroviral AIDS medications would not have occurred without team science. Naturally, there are some challenges with this model as well. When working collaboratively, communication is king; team science fails when communication does. At Meraglim™, we pride ourselves on the effective communication skills needed to provide financial data analytics that take a global perspective.

    Challenges

    A recent study by the National Research Council identified seven primary challenges to team science, which we have outlined below.

    Membership diversity

    Addressing larger issues requires contributions from minds with many different backgrounds, disciplines, and communities. For certain groups, this may cause communication issues and difficulty identifying specific goals. The diversity of team members requires members to meet each other where they are, which isn't easy on all teams.

    Knowledge integration

    As each member brings their own unique knowledge base into the equation, there may be a lack of common ground. Particularly for transdisciplinary teams, this can be difficult as integrating different tools from a variety of areas can be less seamless than desired. While some team members are extremely literate in one theory or model, other team members may need to start from scratch learning these concepts. This can slow progress and frustrate the team.

    Size

    When it comes to team science, the size of the team matters. The larger the team, the more difficult it is to coordinate all of the moving parts. Over the past 60 years, the size of research groups has expanded, and with it the burden of coordinating tasks and communicating. While a larger group can potentially enhance productivity by distributing small tasks more evenly among its members, it can also inhibit the level of trust and intimacy the group develops.

    Goal alignment

    Or rather, misalignment. If team members do not share a common goal, the clarity of the project comes into question. Even if the team does share a common goal, the members may have their own, separate goals as well. This can create conflict and requires proactive management.

    Permeable boundaries

    As the project moves forward, goals may change over time, which can be seen in the permeable boundaries of the team. While these changes can benefit the team by bringing in additional knowledge to address problems, they can also cause disagreement within the team.

    Geography

    Team science often requires working with team members who are dispersed all over the country or world. This can present some logistical challenges, such as greater dependency on technology to communicate, working across time zones, and managing cultural expectations.

    Task interdependence

    In team science, the members are dependent on one another to accomplish tasks. Because the goal is working collaboratively, every member must contribute in a timely manner and be willing to work cooperatively. Task interdependence can often cause conflict, and may require more effort in coordinating and communicating.

    The value

    Despite its challenges, team science is incredibly valuable and worth the effort to overcome these obstacles. In the world of science, individuals still contribute critical discoveries to the field, as exemplified by Stephen Hawking’s work. However, more and more, collaborative research is becoming the norm. This can be seen in research into team science, which shows that group publications are more widely cited. Additionally, groups are more likely to expand upon previous research to create new ideas that have a lasting impact. A couple of studies exemplify this point: in 2012, one transdisciplinary study on tobacco use was more widely published and received more funding than smaller, similar projects. Additionally, one 2014 study looked into the effectiveness of team science by mapping the publications from transdisciplinary research centers and found that these resources spread exponentially across different disciplines. More and more, scientists are finding that team science produces more reliable and reputable results, which has led to more funding and higher publication rates for team science efforts. Given the obvious value of team science, Meraglim™ has adopted the principles behind it to ensure that our product offers the most comprehensive information to help our clients make informed decisions.

    The Science of Team Science

    Team science is complex in the sense that it has many dimensions. It occurs in so many different contexts that it is hard to study in a quantifiable way. Therefore, a new field, the science of team science, has emerged to learn how to make team science more effective and to support the current evidence that it is an effective method for problem-solving and research. Team science researchers are concerned with the following:

    • A diverse range of units of analysis in order to promote team science,
    • An understanding of the structure of collaboration throughout a range of contexts,
    • An understanding of the potential of team science,
    • An understanding of the challenges facing team science,
    • An established criteria for evaluating the outcomes and processes of team science, and
    • The educational and scientific goals of team science.

    As the intricacies of team science develop further, more and more contexts will adopt team science as their primary method. Though we are not directly involved in the scientific community, we apply these principles to our work to ensure the maximum results for your goals.

    At Meraglim™, we have brought together a team with common goals, cohesion, and expertise across a wide knowledge base. We can help your team with financial data analytics. Contact us today to learn more.