Name: | Description: | Size: | Format:
---|---|---|---
 | | 3.02 MB | Adobe PDF
Authors
Advisor(s)
Abstract(s)
In 1983, Len Eidelmen presented, at a computer security seminar, the first computer program capable of replicating itself, managing to install itself in several parts of the system. A year later, the term computer virus was defined as a program that infects other programs, modifying them so that it can install copies of itself.

Currently known as malware, a term used to define a class of computer programs created with the intent of causing damage or even stealing information, these programs are today one of the constant threats to new technologies, whether large or small computer networks, personal computers, or mobile devices.

This makes it crucial to understand this type of software so that increasingly effective security measures can be created, helping administrators of computer systems to act before damage of large proportions occurs.

Malware detection is a challenge: many programs behave in ways that resemble malware without being malicious, and constant mutations in malware code make this work much more arduous.

This work proposes the study of the methodologies applied by various researchers which, based on the analysis of traffic captured on a computer network, help to recognize patterns that allow malware detection rules to be created or modified.
In 1983, Len Eidelmen presented, at a computer security seminar, the first computer program with the ability to self-replicate, managing to install itself in various parts of the system. A year later, the term computer virus was defined as a program that infects other programs, modifying them so that it can install copies of itself.

Currently known as malware, a term used to define a class of computer programs created with the intent to cause damage or even steal information, these programs are today one of the constant threats to new technology, whether large or small computer networks, personal computers, or mobile interfaces.

This makes it crucial to understand this type of software so that increasingly effective security measures can be created, helping administrators of computer systems to act before losses of large proportions occur.

Malware detection is a challenge: many programs exhibit behaviour that resembles malware without being malicious, and constant mutations in malware code make this job much more arduous.

This work proposes the study of the methodologies used by various researchers which, through the analysis of traffic captured on a computer network, help to recognize patterns that allow detection rules to be created or modified in order to generate alerts when some kind of abnormality, usually caused by software considered suspicious or malicious, is found.
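To make the idea of "rules that raise alerts on patterns in captured traffic" concrete, the following is a minimal, illustrative sketch only, not the author's implementation or the methodologies surveyed in the work. It assumes the `scapy` library and reads a hypothetical capture file; the indicator ports, the DNS-query threshold, and the rule set are invented for illustration.

```python
# Toy signature/threshold detector over a packet capture. Illustrative only:
# the rules, thresholds, and capture file name are hypothetical.
from collections import Counter

from scapy.all import rdpcap, IP, TCP, DNS, DNSQR  # pip install scapy

SUSPICIOUS_PORTS = {4444, 6667}   # hypothetical indicators (ports often abused by backdoors / IRC C&C)
DNS_QUERY_THRESHOLD = 100         # hypothetical per-host limit suggesting abnormal DNS activity


def scan_capture(pcap_path: str) -> list[str]:
    """Read a capture file and return human-readable alerts for rule matches."""
    alerts = []
    dns_queries_per_host = Counter()

    for pkt in rdpcap(pcap_path):
        if not pkt.haslayer(IP):
            continue
        src = pkt[IP].src

        # Rule 1: TCP connections to ports commonly associated with malware traffic.
        if pkt.haslayer(TCP) and pkt[TCP].dport in SUSPICIOUS_PORTS:
            alerts.append(f"{src} -> {pkt[IP].dst}:{pkt[TCP].dport} matches suspicious-port rule")

        # Rule 2: count DNS queries per source host to flag unusually high volumes.
        if pkt.haslayer(DNS) and pkt.haslayer(DNSQR):
            dns_queries_per_host[src] += 1

    for host, count in dns_queries_per_host.items():
        if count > DNS_QUERY_THRESHOLD:
            alerts.append(f"{host} issued {count} DNS queries (threshold {DNS_QUERY_THRESHOLD})")

    return alerts


if __name__ == "__main__":
    for alert in scan_capture("capture.pcap"):  # hypothetical capture file
        print("ALERT:", alert)
```

In practice, systems of this kind externalise the rules (as intrusion detection signatures, for example) rather than hard-coding them, so that they can be created or modified as new traffic patterns are observed, which is the kind of workflow the abstract refers to.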