Science Helps to Reduce Cyber Crimes

Computers and the Internet have become indispensable to businesses and households alike, and our dependence on them grows every day, whether for home users, for critical functions such as power grid management and medical applications, or for corporate finance systems. At the same time, delivering services continuously and reliably is becoming a bigger challenge for companies. Cyber security sits at the top of the risks organisations face, with many survey respondents rating it higher than the threat of terrorism or natural disaster.

Despite all the attention cyber security has received, it has been a gruelling journey so far. Worldwide spending on IT security was expected to cross $120 billion in 2017, and it is one of the few areas where most companies’ IT budgets stayed flat or even grew during the recent financial crisis. Yet that spending has not significantly reduced the number of vulnerabilities in software or the attacks mounted by criminal organisations.

It is time to fundamentally rethink how we go about protecting our IT systems. Today’s security posture is fragmented, built from point solutions aimed at specific threats: anti-virus, anti-spam filters, intrusion detection, firewalls. But we have reached a stage where cyber systems are far more than wires, tin, and software; they have systemic, economic, social, and political dimensions. The interconnectedness of systems, intertwined with a human element, makes IT systems inseparable from the human factor. Today’s complex cyber systems almost have a life of their own: they are complex adaptive systems, and we have so far tried to understand and secure them with conventional theories. Staying up to date with cyber security news matters more than ever.

Before getting into the reasons for treating cyber systems as complex systems, here is a brief explanation of what a complex system is. Note that a “system” can be any combination of people, processes, or technologies that serves a particular objective: the wristwatch on your wrist, sub-oceanic coral reefs, or the economy of a country are all examples of a “system”.

In a nutshell, a “complex system” is one in which the parts of the system and their interactions together produce a particular behaviour, such that analysing the constituent parts alone cannot explain that behaviour. In such systems cause and effect cannot necessarily be tied together, and the interactions are non-linear: a small change can have a disproportionate impact. In other words, as Aristotle said, “the whole is more than the sum of its parts”. One of the best-known examples in this context is a city’s traffic system and the emergence of traffic jams: studying individual cars and drivers cannot explain the patterns and the emergence of congestion.

A complex adaptive system (CAS) also exhibits self-learning, evolution, and emergence among the agents in the system. The agents or participants in a CAS show heterogeneous behaviour, and their behaviour and interactions with other agents evolve continuously. An important distinction here is between processes that are complex and those that are merely “complicated”.

A complex process is one that produces unpredictable results, however simple the steps may appear; a complicated process is one with many intricate steps and demanding preconditions, but a predictable outcome. A common example: making tea is complex (at least in my experience, no two cups ever turn out exactly the same), whereas building a car is complicated. David Snowden’s Cynefin framework gives a more formal explanation of these concepts.

IT systems today are designed and built by us, the community of IT workers within organisations together with their vendors, and collectively we have all the knowledge there is to have about these systems. Why, then, do we see fresh attacks on IT systems every day that we never anticipated, exploiting vulnerabilities we never knew existed? One reason is that any IT system is built by thousands of individuals across the whole technology stack, from the business applications down to the underlying network components and the hardware they run on. That introduces a strong human element into the design of cyber systems, so opportunities to introduce flaws that become security vulnerabilities exist everywhere.

Most organisations have multiple layers of defence around their critical systems (layers of firewalls, IDS, a hardened O/S, strong authentication), yet attacks still happen. More often than not, computer break-ins are a collision of circumstances rather than a single standalone vulnerability being exploited for an attack to succeed. In other words, it is the “whole” of the circumstances and the attackers’ actions that cause the damage.

Reductionism and holism are two contradictory philosophical approaches to the analysis and design of any object or system. Reductionists argue that any system can be understood by “reducing” it to its constituent parts and analysing them, while holists argue that the whole is greater than the sum of its parts, so a system cannot be understood purely by analysing its components.

Reductionists argue that all systems and machines can be understood by examining their constituent parts, and most of modern science and analysis follows the reductionist approach, fairly enough, since it has served us well so far. If you understand what each part does, you can work out what a wristwatch will do; by designing each part separately, you can make a car behave the way you want; and by analysing the positions of celestial objects, we can precisely predict the next solar eclipse. Reductionism has a strong focus on causality: every effect has a cause.

But that is as far as the reductionist view can go in explaining a system’s behaviour. When it comes to emergent systems, such as human behaviour, socio-economic systems, biological systems, or socio-cyber systems, the reductionist approach has its limitations. Simple examples, the human body, the reaction of a crowd to a political stimulus, the reaction of financial markets to the news of a merger, even a traffic jam, cannot be predicted by studying only the behaviour of the individual members of these “systems”.

We have always looked at cyber security through a reductionist lens, with point solutions for individual problems, trying to anticipate the attacks a cybercriminal might launch against known vulnerabilities. It is time we started looking at cyber security through a holistic lens as well.

Computer break-ins are more like viral or bacterial infections than a home or car burglary. A burglar who breaks into a house cannot use it as a launch pad to break into the neighbours’ homes, nor can a vulnerability in one car’s locking system be exploited against millions of other cars across the world at the same time. Computer break-ins behave far more like bacteria infecting the human body.

They can spread the infection just as it spreads among humans: they can affect vast portions of a “species” as long as its members are “connected” to one another, and in the case of severe infections the affected systems are usually isolated, just as people are put in quarantine to stop an infection spreading [9]. The lexicon of cyber systems even borrows biological metaphors: viruses, worms, infections, and so on. There are many parallels with epidemiology, but the design principles commonly applied to cyber systems do not follow natural selection. Cyber systems are built for uniformity of technology and process, in contrast to the genetic diversity within a biological species that makes it resilient to epidemic attacks.

The flu pandemic of 1918 killed tens of millions of people, more than the Great War itself. Most of the world’s population was affected, but why did it strike 20-40 year olds harder than other age groups? Perhaps a difference in body make-up caused a different response to the attack?

Complexity theory has gained considerable traction and proven extremely useful in epidemiology, in understanding the patterns by which infections spread and the ways to contain them. Researchers are now turning to cyber systems with the lessons learned from the natural sciences.

Traditionally there have been two distinct and complementary approaches to reducing the security risk of cyber systems, and both are in use in most modern systems today.

The first approach relies mostly on an IT system’s testing team to uncover any flaw in the system that could expose a vulnerability exploitable by attackers. This may be functional testing to validate that the system gives the correct answers as expected, penetration testing to validate its resilience to specific attacks, or availability and resilience testing. The scope of this testing is usually the system itself, not the defences deployed around it.

This works well for relatively simple, self-contained systems where the possible user journeys are fairly straightforward. For most interconnected systems, however, formal validation alone is not sufficient, because it is impossible to “test everything”.

Where a system cannot be fully validated through formal testing, the second approach adds extra layers of defence: firewalls, network segregation, or virtual machines hidden from parts of the network, for instance. Other common defence mechanisms include intrusion prevention systems, anti-virus software, and so on.

Resilience to perturbation is an important emergent property of biological systems. Imagine a species whose members all shared an identical genetic structure, identical body configuration, and identical antibodies and immune systems: the outbreak of a single viral infection could wipe out the entire population. That does not happen, because we are all made differently, and each of us differs in our resistance to infections.

Likewise, some mission-critical cyber systems, especially in the aerospace and medical industries, implement diverse implementations of the same functionality, with a central “voting” function deciding which response to return to the requester when the results from the different implementations disagree.
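A minimal sketch of such a voting function, assuming three hypothetical and trivially diverse implementations (in a real system each would be developed independently, by a different team or on a different platform):

```python
def impl_a(x: int) -> int:
    return x * 2          # variant A: multiplication

def impl_b(x: int) -> int:
    return x + x          # variant B: addition

def impl_c(x: int) -> int:
    return x << 1         # variant C: bit shift

def vote(request: int, implementations) -> int:
    """Run every implementation on the same request and return the
    majority answer; any disagreement hints at a fault or an attack."""
    results = [impl(request) for impl in implementations]
    majority = max(set(results), key=results.count)
    if results.count(majority) < len(results):
        print(f"warning: variants disagreed: {results}")  # flag for investigation
    return majority

print(vote(21, [impl_a, impl_b, impl_c]))  # -> 42
```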

It is fairly common for organisations to run redundant copies of mission-critical systems, but the copies are usually homogeneous implementations rather than diverse ones, leaving them susceptible to exactly the same faults and vulnerabilities as the primary systems. If the implementation of a redundant system is made different from the primary, say a different operating system, a different application container, or a different database, the two variants will resist particular attacks differently. Even a change in the ordering of memory stack access could change how the variants respond to a buffer overflow attack, alerting the central voting system that something is wrong somewhere. As long as the input data and the business function of each implementation are the same, any deviation between the implementations’ responses signals a potential attack. In a true service-based architecture, every “service” could have multiple (but a small number of) diverse implementations, with the business function randomly selecting which implementation serves each new user request. A large number of distinct execution paths can be created this way, increasing the resilience of the overall system.
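A toy dispatcher along these lines (all service names here are hypothetical) might look like:

```python
import random

def lookup_relational(user_id: str) -> dict:
    return {"id": user_id, "backend": "relational"}   # e.g. SQL-backed variant

def lookup_document(user_id: str) -> dict:
    return {"id": user_id, "backend": "document"}     # e.g. document-store variant

# Small pool of diverse implementations of the same "service".
IMPLEMENTATIONS = [lookup_relational, lookup_document]

def handle_request(user_id: str) -> dict:
    impl = random.choice(IMPLEMENTATIONS)  # a fresh choice for every request
    return impl(user_id)

for _ in range(3):
    print(handle_request("alice"))
```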

Multi-variant execution environments (MVEEs) have been developed, in which diverse variants of an application run in lockstep and their responses to each request are monitored [12]. These have proven very effective in detecting intrusions that try to change the program’s behaviour, and in identifying flaws that make the variants respond differently to a request.
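A toy illustration of the lockstep idea (a real MVEE runs diverse variants at the process level; this sketch only mimics the monitor):

```python
def checksum_forward(data: bytes) -> int:
    return sum(data)                 # variant 1: left-to-right accumulation

def checksum_reverse(data: bytes) -> int:
    return sum(reversed(data))       # variant 2: different internal order

def monitored_request(data: bytes) -> int:
    """Run both variants on the same input and compare their answers;
    any divergence is treated as a possible intrusion."""
    a = checksum_forward(data)
    b = checksum_reverse(data)
    if a != b:                       # the variants should always agree
        raise RuntimeError("variants diverged -- possible intrusion")
    return a

print(monitored_request(b"hello"))   # both variants agree -> 532
```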

In a similar vein, using the N-version programming concept, an N-version antivirus was developed at the University of Michigan that ran diverse implementations scanning new files for the corresponding virus signatures. The result was a more resilient antivirus system, less prone to attacks on itself and 35% better at detection across the entire estate.

One of the major areas of study within complexity research is agent-based modelling, a simulation modelling technique.

Agent-based modelling is a simulation technique used to analyse and understand the behaviour of complex systems, and of complex adaptive systems in particular. The individuals or groups interacting with one another in the complex system are represented as artificial “agents” acting according to a predefined set of rules, and the agents can evolve their behaviour and adapt to changing circumstances. In contrast to deductive reasoning, the most widely applied approach to explaining the behaviour of social and economic systems, simulation does not try to generalise the system and its agents’ behaviour.
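A minimal agent-based model in that spirit, using only Python’s standard library and purely illustrative parameters, might simulate malware spreading across a random network of machines:

```python
import random

random.seed(1)
N, P_EDGE, P_INFECT, TICKS = 50, 0.08, 0.3, 10   # illustrative parameters

# Wire up a random undirected network of agents (machines).
neighbours = {i: set() for i in range(N)}
for i in range(N):
    for j in range(i + 1, N):
        if random.random() < P_EDGE:
            neighbours[i].add(j)
            neighbours[j].add(i)

# Each tick, every infected agent tries to compromise each uninfected
# neighbour with probability P_INFECT; the system-wide infection curve
# emerges from these simple local rules.
infected = {0}                                   # agent 0 is the initial compromise
for tick in range(1, TICKS + 1):
    newly = {j for i in infected for j in neighbours[i]
             if j not in infected and random.random() < P_INFECT}
    infected |= newly
    print(f"tick {tick}: {len(infected)} of {N} agents infected")
```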

ABMs have become quite popular for studying problems such as crowd management during fire evacuations, the spread of epidemics, market behaviour and, more recently, financial risk analysis. ABM is a bottom-up modelling technique in which the behaviour of each agent is programmed separately and can differ from that of every other agent. Self-learning and evolutionary behaviour in agents can be implemented with various techniques, genetic algorithms being one of the most popular.

Cyber systems are interconnections of software modules, the wiring of logic circuits, microchips, the Internet, and a large number of users (system users or end users). These actors and their interactions can be implemented in a simulation model to perform what-if analysis, predict the impact of changing parameters, and study the interplay between the actors. Simulation models have long been used to analyse application performance from application characteristics and user behaviour, and some of the popular capacity and performance management tools use the technique. Similar techniques can be applied to analyse how cyber systems respond to attacks, to design fault-tolerant architectures, and to assess the extent of emergent resilience obtained through implementation diversity.

One of the key areas of research in agent-based modelling is the “self-learning” process of agents. In the real world, an attacker’s behaviour evolves with experience, and this aspect of an agent’s behaviour is implemented through a learning process, for which genetic algorithms are among the most widely used techniques.
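A sketch of how a genetic algorithm could drive such learning (the “strategy” encoding, the fitness function, and all parameters are entirely hypothetical):

```python
import random

random.seed(7)
TARGET = [1, 0, 1, 1, 0, 0, 1, 0]    # stand-in for a vulnerable configuration
POP, GENS, P_MUT = 20, 30, 0.05      # population size, generations, mutation rate

def fitness(strategy):
    """Number of positions where the attacker's strategy matches the target."""
    return sum(s == t for s, t in zip(strategy, TARGET))

# Random initial population of attacker strategies (bit strings).
population = [[random.randint(0, 1) for _ in TARGET] for _ in range(POP)]
for gen in range(GENS):
    population.sort(key=fitness, reverse=True)
    parents = population[: POP // 2]             # selection: keep the fitter half
    children = []
    while len(children) < POP - len(parents):
        a, b = random.sample(parents, 2)
        cut = random.randrange(1, len(TARGET))
        child = a[:cut] + b[cut:]                # single-point crossover
        child = [1 - g if random.random() < P_MUT else g
                 for g in child]                 # per-gene mutation
        children.append(child)
    population = parents + children

best = max(population, key=fitness)
print(f"best strategy: {best}, fitness {fitness(best)}/{len(TARGET)}")
```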
