1. The Use of Data Analytics in Financial Markets

    At Meraglim™, we take a multidimensional approach to our product. Not only do we give our clients access to our panel of prominent experts from a variety of fields, we also use an AI analytic engine to provide data analytics as a service (DAaS). More and more, companies are contracting third-party providers for DAaS, as the insights it yields can be revolutionary for business. With the emerging technologies breaking into the market today, it’s imperative to implement the analytic tools at your disposal or risk falling behind the curve. Recently, our partner IBM collaborated with the Saïd Business School at the University of Oxford to examine how banks and financial markets organizations are using data analytics to change the industry in “Analytics: The real-world use of big data in financial services.” In this blog, we will review their findings and how data can serve your team.

    The Importance of Data in Financial Markets

    For banks and financial services companies, there isn’t a physical product to offer. Supplying information is their trade, and data is an important resource for providing quantifiable support for their services. Within the financial services industry, there is endless data to be mined from the millions of transactions performed on a daily basis. The advantage that analyzing this data provides to financial institutions is evident; IBM and Saïd found that 71 percent of financial markets firms report that they have developed a competitive advantage by using financial data analytics. Compared with the responses to a similar IBM study two years prior, that figure represents a 97 percent increase. While banking data has grown to provide more valuable information, in our technologically advanced world, people are now banking and managing finances in a variety of ways. This unstructured data holds important promise for customer insight. Such detailed information can guide investors, financial advisors, and bankers in making the best decisions for their customer base while staying compliant with regulatory laws. Companies have successfully used this data to identify business requirements and leverage their current infrastructure accordingly.
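The 97 percent increase cited above can be sanity-checked with simple arithmetic: it implies the earlier study's figure was roughly 36 percent. The prior figure is not stated in the source, so this is back-of-the-envelope illustration only.

```python
# Back out the implied earlier figure from the numbers cited above:
# 71 percent of firms today, described as a 97 percent increase over
# a similar IBM study two years prior. (The prior figure itself is an
# inference, not a number stated in the source.)
current = 71.0          # percent of firms reporting a competitive advantage
increase = 0.97         # the reported 97 percent increase
prior = current / (1 + increase)
print(round(prior, 1))  # roughly 36.0
```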


    Big Data Movements Today

    Most financial organizations today recognize the importance of big data, and are slowly implementing plans on how to use it. The majority are either currently developing a big data plan (47 percent) or already implementing big data pilots (27 percent). In their study, IBM and Saïd identified four key findings that demonstrate how these companies are using big data.

    The customer is king

    More than half of industry respondents identified customer-driven goals as their priority for big data. This stands to reason, as banks increasingly face pressure to be customer-centric. Financial institutions must keep the customer in mind when designing their technology, operations, systems, and data analytics. Data analytics is an important tool because it enables companies to anticipate changes in the market and in customer preferences, and to quickly take advantage of any opportunities that present themselves.

    Companies need a scalable big data model

    The research also found that the most important consideration when creating a big data model is that it must be able to accommodate the ever-growing amount of information from different sources. In a survey of these financial institutions, though only half of the reporting companies said that they integrated information, IBM found that roughly 87 percent of respondents reported having the infrastructure necessary to accommodate the addition of more information.

    Integrating data across departments and areas has been a challenge to businesses for many years, particularly for banks, due to the sheer amount of data that comes into play. This complex integration work is an essential component of any big data effort, and most often requires adopting new analytics platforms such as NoSQL databases and Hadoop. However, the financial industry is falling behind in this respect.

    Efforts are focused on existing sources of data

    When looking at what financial institutions and banks are doing in terms of big data efforts, the majority are focused on using the data sources they already have internally. This makes sense because, while big data has clear and important implications for the future of these companies, they want to take a cautious approach rather than seeking brand new data and risking that it proves useless. It also speaks to practicality, as there are many uses for these companies’ internal data that are as yet untapped.

    Most commonly, respondents to this survey were analyzing log and transaction data. Every transaction and automated function of a bank or other information system generates data, at a volume that can no longer be analyzed by traditional means. As a result, there are years and years of data that have yet to be analyzed by these institutions. Today, the technology finally enables this information to be used, though someone with the analytical skills is also necessary.

    Banks and financial markets firms could catch up to their peers by analyzing more varied types of data. Roughly 20 percent of respondents analyzed audio data, and about 27 percent analyzed social media. A lack of focus on unstructured data may be limiting their ability to do better on these fronts.

    Analytical ability is important

    While data in and of itself plays an important role, it cannot be put to use without proper analysis. For big data to deliver the highest value, it is essential for financial institutions to access the right data, use the right tools to analyze it, and have the necessary skills to do so. This is why it may be necessary for financial institutions to engage outside expertise, as they may not have the needed analytical skills in-house.

    While participants in the study who were engaged in big data efforts had a strong foundation in certain major analytics, such as basic queries and predictive modeling, these institutions need to work more on data visualization and text analytics. The more data there is, the more important these two elements are to gaining meaning from data. Yet only three out of five respondents with big data efforts included data visualization.

    Additionally, financial institutions fall significantly behind other industries in terms of analyzing different kinds of data. Fewer than 20 percent of respondents included the ability to analyze natural-language text (such as call-center conversations) in their big data efforts. Text analytics allows companies to look not only at what was said, but at the nuances involved in language. This gives companies a bigger picture of what the customer desires and how to improve customer relations. They fall even further behind their peers on other types of data, including geospatial location data and streaming data. While they may have the technology to analyze these areas, they rarely have people with the skills necessary to apply this data.


    Based on the information they gathered, the research team proposed several recommendations for financial institutions’ big data use. First, they suggested focusing efforts on the customer: understanding your customer is the key to success in the market. Second, they emphasized developing a big data plan that aligns with the business’s needs and resources; while it is important to keep up with the technology, an effective blueprint must be in place so that challenges, including future data needs, can be addressed. Third, the researchers suggested that initially building on already available data is key to approaching big data analytics pragmatically. Businesses should also weigh their own priorities for growth when pinpointing what data to examine, rather than just looking at what is in front of them. Finally, they should implement big data strategies with quantifiable measures of success. Most importantly, business leaders and technology specialists need to support each other in their efforts to implement big data plans.

    Meraglim™ is a financial technology company that uses financial data analytics to provide our clients with the information they need to remain one step ahead of everyone else. If you are curious about how our financial technology may benefit your organization, learn more here today.

  2. Smart Dust and Microelectromechanical Systems

    Imagine a world in which tiny dust particles monitor everything on earth, providing seemingly endless amounts of data that has never been accessible before. These tiny sensors would float through the air and capture information about absolutely everything, from the temperature to the chemical composition of the air to any movements to even brainwaves. The implications of this technology would be transformative for a wide variety of fields and applications, from the military to health care to safety/security monitoring to space exploration. In this brave new world, the possibilities are endless.

    Sound like something out of a science fiction novel? It’s not just fantasy; it’s called Smart Dust, and after further research, these tiny sensors could be everywhere in the near future.


    The initial concept of Smart Dust originated from a military research project from the United States Defense Advanced Research Projects Agency (DARPA) and the Research and Development Corporation (RAND) in the 1990s.1 In 2001, the first prototype was built by Kristofer S.J. Pister, an electrical engineering and computer science professor at Berkeley. Pister won the Alexander Schwarzkopf Prize for Technological Innovation for his work on the Smart Dust project.2 In 2004, Pister founded Dust Networks in order to bring Smart Dust to life. In 2011, Linear Technology, an integrated circuits company, acquired Dust Networks.3

    How Smart Dust Works

    Smart Dust is a system made of motes, or tiny sensors. Motes are essentially tiny, low-power computers that can perform many different functions and are composed of microelectromechanical systems (MEMS).


    Microelectromechanical systems are a class of technology that can be broadly defined as miniaturized mechanical and electro-mechanical components created by microfabrication.4 MEMS vary from very simple to quite complex, and are composed of tiny sensors, actuators, and microelectronics. Over the last few decades, MEMS technology has evolved to feature an incredible number of sensor types, including temperature, chemical species, radiation, pressure, humidity, magnetic fields, and more. Interestingly, many of these microsensors function better than their macro counterparts. For example, a micro pressure transducer often outperforms the most advanced macro equivalent. Not only are these devices extremely effective, they are made with the same manufacturing techniques used to create integrated circuitry, translating to low production costs. The incredible performance of MEMS devices, paired with their low cost, means that this technology has been integrated into the commercial marketplace.

    The capabilities of MEMS even today are incredible; for example, there are a variety of microactuators with impressive capabilities, from microvalves that control liquid and gas flow, to micromirror arrays for displays, to micropumps that create fluid pressure. However, combining this technology with others, such as microelectronics, photonics, and nanotechnology, will drive the truly meteoric rise of these devices as one of the most innovative developments in technology of this century. In the future, Smart Dust will not only be able to collect data, but will perform actions that manipulate the environment around it. With the diverse potential of MEMS devices, it will be thrilling to see where Smart Dust goes in the future.


    One Smart Dust mote holds a semiconductor laser diode, a beam-steering mirror, a MEMS corner-cube retro reflector, an optical receiver, and a power source composed of batteries and solar cells.5 Beyond the astounding power of MEMS, Smart Dust is also made possible by wireless communication and advanced digital circuitry. This is why it is possible for the motes to be as small as they are while containing a battery, RAM, and a wireless transmitter. The idea is that the motes should be as tiny as possible while having an advanced operating system that enables the entire system to work together.


    In the world of open source hardware and software development, two operating platforms are most often used: Arduino and TinyOS. The main difference between them is that TinyOS is specifically designed for low-power sensors with wireless communication. Therefore, while Arduino is easier to use, TinyOS is the ideal operating system for Smart Dust. TinyOS provides software abstractions ideal for smart buildings, personal area networks, smart meters, and sensor networks. The main issue with TinyOS in the context of Smart Dust is that it is specifically designed to run code in short snippets for a singular function, rather than to perform complex actions. So while it is great for the goal of collecting data with the motes, it is less capable of powering the base station that collects the data.



    Despite the revolutionary nature of this technology, there are still obstacles to its being used as extensively as it could be. One obstacle is size; while MEMS sensors are quite small, with protective casing they are still bigger than a matchbox.6 Ideally, this technology would be microscopic for a variety of purposes, so research centers in part on making it even smaller. Additionally, for Smart Dust to be valuable, the sensors must perform their measurements and then communicate back to a base station where the data can be compiled. Doing this reliably has been a focus of developers in recent research. Potential solutions include optical transmission and radio frequency. How exactly they will ensure reliable communication between the MEMS sensors and the base station is yet to be determined.


    Smart Dust has astonishing possibilities for so many different industries that it’s hard to pinpoint where it will have the greatest benefit. However, the military benefits are probably the most obvious, which is why it was developed through military research. Smart Dust could enable military personnel to get critical information. For example, it could be used to track movements around a corner, assessing whether people are present and whether they are armed. Forces could receive critical information about enemy territory, putting them at an advantage during combat. The intelligence that Smart Dust could potentially offer the military is unbelievable.

    However, Smart Dust has unlimited capabilities far outside the defense sector. The varied range of sensor types already afforded to us by MEMS makes the possible applications for Smart Dust truly endless. For example, Smart Dust could give us such precise meteorological insight that everyone would have exact, real-time information about the weather. Any type of research that is impeded by wired sensors could be revolutionized by Smart Dust; for example, the motes could easily go into wind tunnels, anechoic chambers, or rotating machinery to acquire information. Beyond that, it has fascinating implications for biological research. For example, Smart Dust could be used to monitor the internal processes of small animals such as mice or insects. This could lead to unprecedented research into diseases and the effects of medication, as well as generally give us deeper biological insight than ever possible before.

    Perhaps most radically, MEMS technology has amazing possibilities for space exploration. Smart Dust could be sent to another planet to collect data on the atmosphere and environment. It could be Smart Dust that determines that other worlds are habitable for humans. Undeniably, this has fascinating implications for the future of humanity and space travel.

    Other MEMS Projects

    Due to the obvious benefits this type of system can provide to the military, DARPA has continued to fund several different projects in the realm of MEMS. This is promising, as many of the most innovative technologies of our time, including nuclear power, radar, jet engines, and the internet, developed out of military research. Several MEMS projects have come out of DARPA’s Microsystems Technology Office (MTO). For example, DARPA recently awarded HRL Laboratories $1.5 million to develop a low-power oven-controlled crystal oscillator (OCXO) to power atomic clocks.7 To do so, they will combine MEMS technology with quartz dry plasma etching techniques, which will allow developers to create more efficient and reliable atomic clocks for the military. Outside of a military application, this technology could be applied to improve GPS technology and reduce the cost of producing handheld navigation systems.

    Additionally, DARPA is currently focusing energies on developing Micro Power Generation (MPG).8 As stated above, MEMS technology is currently limited by its size. A new focus is being placed on developing a way to power these devices without bulky batteries. The MPG program looks to use hydrocarbon fuels to power MEMS technology instead of the lithium-ion batteries currently being used. If successful, the power generator would be five to 10 times smaller than a battery of equal power, which could have incredible implications for military weapon systems and field awareness. This could also revolutionize the ways MEMS technology is used outside of the military, such as commercially or for geological or space research.

    As a financial technology company, we stay on top of the latest developments in technology so we can anticipate the changes that have a direct impact on the global money market and world at large. If you need our predictive powers, contact Meraglim™ today to learn more about how we can help your team.

  3. Why We Use Team Science

    At Meraglim™, our team is composed of top-level leaders from a variety of industries, including defense, capital markets, intelligence, science, and the private sector. Our seemingly unrelated disciplines come together to form one of the greatest strengths we can offer our clients: effective team science. In the world of science, as technology has advanced and enabled us to work more collaboratively, the benefits of pooling the knowledge of people from different fields have become well-known. Today, most scientific articles are written by six to 10 individual scientists from several different institutions. We have all benefited from this new standard, as many scientific breakthroughs have occurred through team science that otherwise would not have been possible; for example, the development of antiretroviral AIDS medications would not have occurred without team science. Naturally, there are some challenges with this model as well. When working collaboratively, communication is king; team science fails when communication does. At Meraglim™, we pride ourselves on the effective communication skills needed to provide financial data analytics that take a global perspective.


    A recent study by the National Research Council identified seven primary challenges to team science, which we have outlined below.

    Membership diversity

    Addressing larger issues requires the contribution of minds from many different backgrounds, disciplines, and communities. For certain groups, this may cause communication issues and difficulty identifying specific goals. The diversity of team members requires them to meet each other where they are, which isn’t easy on all teams.

    Knowledge integration

    As each member brings their own unique knowledge base into the equation, there may be a lack of common ground. Particularly for transdisciplinary teams, this can be difficult as integrating different tools from a variety of areas can be less seamless than desired. While some team members are extremely literate in one theory or model, other team members may need to start from scratch learning these concepts. This can slow progress and frustrate the team.


    Team size

    When it comes to team science, the size of the team matters. The larger the team, the more difficult it is to coordinate all of the moving parts. Over the past 60 years, the size of research groups has expanded, and with it the burden of coordinating tasks and communicating. While larger groups can potentially enhance productivity by distributing small tasks more evenly among group members, they can also inhibit the level of trust and intimacy developed within the group.

    Goal alignment

    Or rather, misalignment. If team members do not share a common goal, the clarity of the project comes into question. Even if the team does share a common goal, the members may have their own, separate goals as well. This can create conflict and requires proactive management.

    Permeable boundaries

    As the project moves forward, goals may change over time, which is reflected in the permeable boundaries of the team as members join and leave. While these changes can benefit the team by bringing in additional knowledge to address problems, they can also cause disagreement within the team.


    Geographic dispersion

    Team science often requires working with team members who are dispersed all over the country or world. This can present logistical challenges, such as greater dependency on technology to communicate, working across time zones, and managing cultural expectations.

    Task interdependence

    In team science, the members are dependent on one another to accomplish tasks. Because the goal is working collaboratively, every member must contribute in a timely manner and be willing to work cooperatively. Task interdependence can often cause conflict, and may require more effort in coordinating and communicating.

    The value

    Despite its challenges, team science is incredibly valuable and worth the effort to overcome these obstacles. In the world of science, individuals still contribute critical discoveries to the field, as exemplified by Stephen Hawking’s work. However, more and more, collaborative research is becoming the norm. This can be seen in research into team science, which shows that group publications are more widely cited. Additionally, groups are more likely to expand upon previous research to create new ideas that have a lasting impact. A couple of studies exemplify this point: in 2012, one transdisciplinary study on tobacco use was more widely published and received more funding than smaller, similar projects. Additionally, one 2014 study looked into the effectiveness of team science by mapping the publications from transdisciplinary research centers and found that their influence spread exponentially across different disciplines. More and more, scientists are finding that team science produces more reliable and reputable results, which has led to more funding and higher publication rates for team science efforts. Given the obvious value of team science, Meraglim™ has adopted its principles to ensure that our product offers the most comprehensive information to help our clients make informed decisions.

    The Science of Team Science

    Team science is complex in the sense that it has many dimensions. It occurs in so many different contexts that it is hard to study in a quantifiable way. Therefore, a new field, the science of team science, has emerged to learn how to make team science more effective and to strengthen the evidence that it is an effective approach to problem-solving and research. Scientists in this field are concerned with the following:

    • A diverse range of units of analysis in order to promote team science,
    • An understanding of the structure of collaboration throughout a range of contexts,
    • An understanding of the potential of team science,
    • An understanding of the challenges facing team science,
    • Established criteria for evaluating the outcomes and processes of team science, and
    • The educational and scientific goals of team science.

    As the intricacies of team science develop further, more and more contexts will adopt team science as their primary method. Though we are not directly involved in the scientific community, we apply these principles to our work to ensure the maximum results for your goals.

    At Meraglim™, we have brought together a team with common goals, cohesion, and expertise spanning a wide knowledge base. We can help your team with financial data analytics. Contact us today to learn more.

  4. SDRs and Impending Inflation and Panic

    Globally, we are on the precipice of a financial crisis. One event that is sure to spark inflation is the inevitable mass production of Special Drawing Rights (SDRs) by the International Monetary Fund (IMF). SDRs are widely misunderstood, so when this does occur, it is likely to pass unnoticed by the majority. At minimum, this will cause massive inflation; at worst, it will cause a loss of confidence in paper money. That would lead to a surge in gold buying, which would in turn send the price of gold skyrocketing. It would be wise for global leaders to reach an agreement comparable to Bretton Woods now to offset this panic. However, this is unlikely to actually happen until after the financial collapse, when they will have to reform the global monetary system in a state of panic. To fully understand what will happen when SDRs are mass-produced, you must first understand what exactly they are. In this blog, we will go over everything you need to know about SDRs.

    The role of the SDR

    In 1969, under the Bretton Woods fixed exchange rate system, the IMF created the SDR to serve as a supplementary international reserve asset. Under Bretton Woods, participating countries were required to hold reserves (government holdings of foreign currencies or gold) that could be used to purchase their domestic currencies in global exchange markets in order to maintain their exchange rates. However, the international supply of the two main reserve assets, the US dollar and gold, could not support the level of trade expansion taking place at that time. The SDR was born out of the need for a reserve asset controlled by the IMF.

    Shortly after the SDR was created, the Bretton Woods system dissolved, and major currencies moved to floating exchange rates. At the same time, international capital markets grew, enabling governments to borrow more readily, and countries accumulated international reserves, which decreased dependency on the SDR. During more recent crises, such as the 2008 financial crisis in the United States, SDR allocations totaling 182.6 billion created liquidity in the global economy and provided necessary reserves for several countries.

    SDRs are not currency. In actuality, they act as a potential claim on the usable currencies of IMF members. SDR holders can exchange their SDRs for currency in one of two ways: first, through voluntary exchange arrangements between IMF members; and second, through the IMF designating members with strong external positions to buy SDRs from weaker members. The SDR also serves as the unit of account of the IMF, as well as of several other international organizations.

    The value of the SDR

    At first, the SDR’s value was established as equivalent to 0.888671 grams of gold, the value of one US dollar at the time. In 1973, when Bretton Woods collapsed, the IMF redefined the SDR as a basket of currencies. Currently, the SDR basket includes the US dollar, the euro, the Chinese renminbi, the Japanese yen, and the pound sterling.

    The value of the SDR in relation to the dollar changes daily and is posted on the IMF website. Its value is the sum of each basket currency’s fixed amount valued in dollars, based on exchange rates quoted daily at noon in the London market.

    Every five years, the Executive Board reviews the composition of this basket of currencies, unless an event occurs that causes the IMF to believe an earlier review is required. This is to make sure that the SDR reflects currencies’ importance within the global economy. The most recent review concluded in November 2015, when the IMF determined that, starting in October 2016, the Chinese renminbi would be included in the basket. During this review, the IMF also implemented a new weighting formula. This formula assigns equal shares to the exports of the currency issuer and a composite financial indicator. The indicator is composed of equal shares of official reserves denominated in the currency (held by authorities other than the issuer), foreign exchange turnover in the currency, and the sum of outstanding international liabilities and debt securities denominated in the currency.
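The weighting formula described above can be sketched in a few lines. This is an illustrative reading of the formula, not the IMF's official implementation, and the share values passed in below are hypothetical.

```python
def composite_financial_indicator(reserves_share, fx_turnover_share,
                                  liabilities_share):
    """Equal shares of the three components named above: official reserves
    denominated in the currency (held by non-issuing authorities), foreign
    exchange turnover, and outstanding international liabilities plus debt
    securities. Each input is the currency's share of the basket-wide
    total for that component."""
    return (reserves_share + fx_turnover_share + liabilities_share) / 3.0

def currency_weight(exports_share, financial_indicator):
    """Equal shares of the currency issuer's exports and the composite
    financial indicator."""
    return 0.5 * exports_share + 0.5 * financial_indicator

# Hypothetical shares for a single currency:
indicator = composite_financial_indicator(0.30, 0.24, 0.27)  # = 0.27
print(round(currency_weight(0.40, indicator), 3))            # 0.335
```

In practice the raw weights for all five currencies would then be normalized so they sum to 100 percent.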

    The breakdown of weights in the currency basket is:

    • US dollar: 41.73 percent
    • Euro: 30.93 percent
    • Chinese renminbi: 10.92 percent
    • Japanese yen: 8.33 percent
    • Pound sterling: 8.09 percent

    The weight of each currency determines the amount of that currency included in the valuation basket that took effect in October 2016. These amounts are fixed for five years, until the next SDR valuation review. Because the currency amounts are fixed, the relative weights can change during valuation periods: a currency’s weight rises when it appreciates relative to the other basket currencies, and falls when it depreciates. The next review will occur before October 2021.
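The fixed-amount mechanics described above can be sketched as follows. The currency amounts and exchange rates here are hypothetical placeholders (the official figures are published daily by the IMF), so treat this as an illustration of the method rather than a source of real values.

```python
# Illustrative sketch of SDR valuation: fixed currency amounts per SDR,
# converted at the day's USD exchange rates. All figures are hypothetical.

# Fixed amounts of each basket currency per SDR (held constant between
# five-year reviews; these numbers are placeholders, not IMF data).
BASKET_AMOUNTS = {
    "USD": 0.58,
    "EUR": 0.39,
    "CNY": 1.02,
    "JPY": 11.90,
    "GBP": 0.086,
}

def sdr_value_usd(usd_rates):
    """Value of one SDR in US dollars: the sum of each fixed currency
    amount converted at the day's USD exchange rate."""
    return sum(amount * usd_rates[ccy]
               for ccy, amount in BASKET_AMOUNTS.items())

def implied_weights(usd_rates):
    """Each currency's share of the basket at today's rates. Because the
    amounts are fixed, these weights drift as exchange rates move."""
    total = sdr_value_usd(usd_rates)
    return {ccy: amount * usd_rates[ccy] / total
            for ccy, amount in BASKET_AMOUNTS.items()}

# Hypothetical noon London exchange rates (USD per unit of currency).
rates = {"USD": 1.0, "EUR": 1.10, "CNY": 0.15, "JPY": 0.009, "GBP": 1.30}
print(round(sdr_value_usd(rates), 4))
print({k: round(v, 3) for k, v in implied_weights(rates).items()})
```

Note how a rise in, say, the euro's exchange rate would increase the euro term in the sum and thus its implied weight, exactly the drift the paragraph above describes.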

    The SDR interest rate

    The SDR interest rate is the basis for calculating the interest charged to borrowing members, as well as the interest paid to members for providing resources for IMF loans. It also determines how much members are paid on their SDR holdings. The rate is set weekly, based on a weighted average of interest rates on short-term debt instruments in the money markets of the SDR basket currencies.
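As a sketch, the weekly calculation amounts to a weighted average of short-term money-market rates. For simplicity this reuses the basket shares listed earlier as the weights (in practice the effective weights drift with exchange rates), the instrument rates are hypothetical, and the 0.05 percent floor is an assumption based on current IMF practice rather than something stated above.

```python
# Basket weights from the October 2016 rebasing (see the list above).
WEIGHTS = {"USD": 0.4173, "EUR": 0.3093, "CNY": 0.1092,
           "JPY": 0.0833, "GBP": 0.0809}

def sdr_interest_rate(money_market_rates, weights=WEIGHTS, floor=0.0005):
    """Weighted average of the basket currencies' short-term money-market
    rates, subject to a floor (assumed here to be 0.05 percent)."""
    average = sum(weights[c] * money_market_rates[c] for c in weights)
    return max(average, floor)

# Hypothetical three-month rates for each basket currency:
rates = {"USD": 0.020, "EUR": 0.000, "CNY": 0.025,
         "JPY": -0.001, "GBP": 0.007}
print(round(sdr_interest_rate(rates), 6))  # about 0.011559, i.e. 1.16 percent
```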

    Who receives SDRs

    The IMF can allocate SDRs to member countries in proportion to their IMF quotas. This provides an unconditional reserve asset to each member. The SDR system is self-financing: it levies charges on allocations, which are in turn used to pay interest on SDR holdings. If a member does not use any of its SDR holdings, the charges it pays equal the interest it receives. If its SDR holdings rise above its allocation, the member earns interest on the excess; conversely, if it holds fewer SDRs than allocated, it pays interest on the shortfall. SDR cancellations are also permitted, but have never been used.
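The quota-proportional allocation and the interest mechanics described above can be sketched as follows; the member names and quota figures are hypothetical.

```python
def allocate_sdrs(total_allocation, quotas):
    """Distribute a general SDR allocation among members in proportion
    to their IMF quotas."""
    total_quota = sum(quotas.values())
    return {member: total_allocation * quota / total_quota
            for member, quota in quotas.items()}

def net_interest(holdings, allocation, sdr_rate):
    """Interest received on holdings minus charges paid on the allocation.
    Zero when a member holds exactly its allocation; positive on an
    excess; negative on a shortfall."""
    return (holdings - allocation) * sdr_rate

# Hypothetical members and quotas:
quotas = {"Country A": 100.0, "Country B": 300.0}
alloc = allocate_sdrs(400.0, quotas)
print(alloc)                                          # proportional to quotas
print(net_interest(120.0, alloc["Country A"], 0.01))  # interest on the excess
```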

    There is also a provision that allows the IMF to prescribe holders of SDRs that are not members, including organizations such as the Bank for International Settlements (BIS) and the European Central Bank (ECB). These prescribed holders can hold and use SDRs in transactions with members or other prescribed holders. Additionally, the IMF cannot allocate SDRs to itself.

    A general SDR allocation must be based on a global need to add to reserve assets. General SDR allocations can be made for periods of up to five years, but they have been made only three times: in 1970-72, in 1979-81, and in 2009.

    On August 10, 2009, a special one-time allocation of SDR 21.5 billion was made. This allowed all IMF members to participate in the SDR system on an equitable basis, addressing the fact that countries that joined after 1981 had never received an SDR allocation before 2009. Together, total allocations amount to SDR 204.1 billion.

    Buying/selling SDRs

    Members sometimes need SDRs to meet obligations to the IMF, or they may wish to sell SDRs to adjust the composition of their reserves. In these cases, the IMF acts as an intermediary between members to ensure that SDRs can be exchanged for freely usable currency. For the last two decades, the SDR market has operated solely through voluntary trading arrangements, which have been the key to SDR liquidity since 1987. After the 2009 general allocation, the voluntary arrangements were expanded to preserve liquidity in the market; there are now 32 such arrangements, 19 of them added since 2009. Should there ever be insufficient capacity under the voluntary arrangements, a designation mechanism stands in reserve: the IMF can designate members with strong external positions (as assessed by the IMF) to buy SDRs, up to certain amounts, from members with weaker positions. This backstop ensures the liquidity and reserve-asset character of the SDR.

    At Meraglim™, our comprehensive understanding of the intricacies of the global economy can serve you and your team. When you need financial data analytics, you can rely on our panel of experts, paired with our innovative risk assessment software, to provide you with the information you need. Interested in how we can help your team? Contact us today.

  5. Debt Deleveraging and the 2008 Financial Crisis

    With the 2008 bursting of the global credit bubble sparking the first global financial recession since the Great Depression, governments everywhere face overwhelming amounts of debt, making recovery a daunting task even nearly a decade later. Debt is still growing; in fact, every major economy carries more debt than it did in 2007. Government officials and business leaders must now consider how to prevent future crises and how to deleverage the debt they have accrued. Since the 2008 financial crisis, the McKinsey Global Institute (MGI) has been researching the implications of debt deleveraging and its consequences for the global economy. In this blog, we will go over some of their key findings and how this knowledge can help leaders, both in government and in business, make educated decisions.

    Rising Global Debt

    Through an analysis of the debt of 22 advanced and 25 developing countries, the MGI found that debt throughout the world has outpaced GDP growth, rising by $57 trillion from 2007 to 2014. The debt-to-GDP ratio rose in every advanced economy studied, and in a significant number of them it rose by more than 50 percentage points.

    Emerging Risks

    Through this research, MGI identified three emerging risks that require our attention:

    • Rising government debt, in some countries so severe that new ways of reducing it will need to be devised
    • Rising household debt and housing prices, which are at all-time highs in Northern Europe and parts of Asia
    • China’s skyrocketing debt, which quadrupled in the span of seven years

    Government debt

    In some countries, government debt is higher than can be sustained. Government debt alone has risen by $25 trillion since 2007, and given the current economic environment, this growth is unlikely to stop in many countries. Some of this debt stems directly from the crisis, where governments funded stimulus programs and bailouts; the rest reflects the recession and the weak recovery. For the six most indebted countries, deleveraging through growth alone would require unrealistically large increases in GDP. New strategies will therefore be needed for these governments, such as wealth taxes, asset sales, and debt restructuring.

    Household debt

    Global household debt has reached an all-time high. Only four countries (Ireland, the UK, Spain, and the US) have seen household debt deleverage; in most others, debt-to-income ratios have steadily risen. These ratios now exceed their pre-2008 peaks in many countries, including Australia, Denmark, the Netherlands, Malaysia, Thailand, and Canada. Managing household debt must be a priority for these governments. Some of the ways they can address it are tighter lending standards, more flexible mortgage contracts, and clearer personal-bankruptcy rules.

    China’s debt

    From 2007 to 2014, China’s debt quadrupled, rising from $7 trillion to $28 trillion. This growth has revealed three troubling developments:

    • Half of all loans are linked, directly or indirectly, to China’s real-estate market
    • Almost half of new lending flows through unregulated shadow banking
    • Local governments’ debt is unsustainable

    Fortunately, according to MGI’s calculations, China’s government could afford to bail out the financial sector in the event of a property-related debt crisis. The challenge is to contain further debt growth and the risk of a crisis without inhibiting economic growth.

    Debt Started Growing Before 2008

    When analyzing the financial crisis of 2008, most point to mortgage lending and financial-sector leverage in the United States. MGI, however, sees a bigger picture that includes pre-2008 factors that made the crisis possible. The globalization of banking and unusually low interest rates, for example, fueled rapid debt growth after 2000 in many major countries, and several countries had higher debt-to-GDP ratios than the United States before 2008. On its own, though, that ratio tells us little about current leverage levels and how sustainable they are.

    Before 2007, households increased their borrowing, particularly through mortgages. Rising housing prices made debt-to-asset ratios appear steady, but household debt relative to disposable income escalated. Businesses, with the exception of commercial real estate and companies acquired through leveraged buyouts, entered the crisis with lower leverage than they had carried in 2000. Government debt, meanwhile, was flat before the crisis and in some countries even declining.

    In the financial sector, leverage growth varied by country. Bank leverage increased only moderately compared with historic levels, and only certain segments of the financial sector became more leveraged before the crisis. The quality of capital at many banks deteriorated, however, because they substituted hybrid instruments for common equity, yet common equity was the only form of capital that reliably absorbed losses. Since banks have many incentives to replace equity with debt, raising the amount of common equity banks are required to hold may help improve the quality of their capital.

    meraglim_debtdeleveraging_blog_innerimage2Deleveraging Today

    Though the 2008 crisis halted credit growth, deleveraging has only just begun. In 2009, the total debt-to-GDP ratio fell slightly in some countries, such as the US, the UK, and South Korea. The limited extent of deleveraging may be due to skyrocketing government debt, which offset declines in household debt.

    In contrast, leverage in the financial sector has fallen back to pre-crisis levels. By the second quarter of 2009, most banking systems had deleveraged to the point that their leverage was at or below the average of the preceding 15 years. Whether banks will need more capital on top of what they have already accumulated remains to be seen. Any increase in capital requirements should be approached with caution while broad deleveraging is likely under way, so as not to curtail the provision of credit too sharply.

    Meraglim™ is a financial technology company that provides global leaders and institutional investors with the information they need, bringing together our collective expertise and innovative risk analysis software. Using the latest research, our panel of experts from the worlds of defense, law, intelligence, and the private sector comes together to provide the information you need to make informed decisions. If you’re interested in seeing what Meraglim™ can offer your team, contact us.