Targeted attack detection by means of free and open source solutions
- Authors: Bernardo, Louis F
- Date: 2019
- Subjects: Computer networks -- Security measures , Information technology -- Security measures , Computer security -- Management , Data protection
- Language: English
- Type: text , Thesis , Masters , MSc
- Identifier: http://hdl.handle.net/10962/92269 , vital:30703
- Description: Compliance requirements are part of everyday business requirements for various areas, such as retail and medical services. As part of compliance it may be required to have infrastructure in place to monitor the activities in the environment to ensure that the relevant data and environment are sufficiently protected. At the core of such monitoring solutions one would find some type of data repository, or database, to store and ultimately correlate the captured events. Such solutions are commonly called Security Information and Event Management, or SIEM for short. Larger companies have been known to use commercial solutions such as IBM's QRadar, LogRhythm, or Splunk. However, these come at significant cost and aren't suitable for smaller businesses with limited budgets. These solutions require manual configuration of event correlation for detection of activities that place the environment in danger. This usually requires vendor implementation assistance, which also comes at a cost. Alternatively, there are open source solutions that provide the required functionality. This research demonstrates building an open source solution, with minimal to no cost for hardware or software, while still maintaining the capability of detecting targeted attacks. The solution presented in this research includes Wazuh, which is a combination of OSSEC and the ELK stack, integrated with a Network Intrusion Detection System (NIDS). The success of the integration is determined by measuring positive attack detection for each of the different configuration options. To perform the testing, a deliberately vulnerable platform named Metasploitable was used as the victim host. The victim host's vulnerabilities were created specifically to serve as targets for Metasploit. The attacks were generated by utilising the Metasploit Framework on a prebuilt Kali Linux host.
- Full Text:
- Date Issued: 2019
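The measurement described in the abstract — positive attack detection per configuration option — amounts to tallying which launched attacks produced at least one alert under each configuration. A minimal sketch, assuming a simplified alert record (the `config` and `attack` fields are invented labels, not Wazuh's actual alert schema):

```python
from collections import defaultdict

def detection_rate(alerts, attacks_launched):
    """Compute the fraction of launched attacks detected per configuration.

    `alerts` is a list of dicts carrying the configuration under test and
    the attack the alert matched; `attacks_launched` maps each
    configuration to the number of distinct attacks fired against it.
    """
    detected = defaultdict(set)
    for alert in alerts:
        detected[alert["config"]].add(alert["attack"])
    return {
        cfg: len(detected[cfg]) / total
        for cfg, total in attacks_launched.items()
    }

# Two attacks fired per configuration; only the NIDS-augmented
# configuration catches both in this made-up data.
alerts = [
    {"config": "ossec-only", "attack": "vsftpd_backdoor"},
    {"config": "ossec+nids", "attack": "vsftpd_backdoor"},
    {"config": "ossec+nids", "attack": "distcc_exec"},
]
rates = detection_rate(alerts, {"ossec-only": 2, "ossec+nids": 2})
```

Comparing the resulting rates across configurations is what the research's success criterion reduces to.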
Evaluating the cyber security skills gap relating to penetration testing
- Authors: Beukes, Dirk Johannes
- Date: 2021
- Subjects: Computer networks -- Security measures , Computer networks -- Monitoring , Computer networks -- Management , Data protection , Information technology -- Security measures , Professionals -- Supply and demand , Electronic data personnel -- Supply and demand
- Language: English
- Type: text , Thesis , Masters , MSc
- Identifier: http://hdl.handle.net/10962/171120 , vital:42021
- Description: Information Technology (IT) is growing rapidly and has become an integral part of daily life. It provides a boundless list of services and opportunities, generating boundless sources of information, which could be abused or exploited. Due to this growth, thousands of new users are added to the grid, using computer systems in static and mobile environments; this fact alone creates endless volumes of data to be exploited and hardware devices to be abused by the wrong people. The growth in the IT environment adds challenges that may affect users in their personal, professional, and business lives. There are constant threats to corporate and private computer networks and computer systems. In the corporate environment companies try to eliminate the threat by testing networks with penetration tests and by implementing cyber awareness programs to make employees more aware of the cyber threat. Penetration tests and vulnerability assessments are undervalued: they are seen as a formality and are not used to increase system security. If used regularly, computer systems would be more secure and attacks minimized. With the growth in technology, industries all over the globe have become fully dependent on information systems in doing their day-to-day business. As technology evolves and new technology becomes available, the risk grows, as do the dangers that must be protected against. For industry to protect itself against this growth in technology, personnel with a certain skill set are needed. This is where cyber security plays a very important role in the protection of information systems, ensuring the confidentiality, integrity and availability of the information system itself and the data on the system. Due to this drive to secure information systems, the need for cyber security professionals is on the rise as well. It is estimated that there is a shortage of one million cyber security professionals globally.
What is the reason for this skills shortage? Will it be possible to close this gap? This study is about identifying the skills gap and identifying possible ways to close it. In this study, research was conducted on international cyber security standards, cyber security training at universities, and international certification, focusing specifically on penetration testing, together with an evaluation of industry's needs when recruiting new penetration testers, concluding with suggestions on how to fill possible gaps in the skills market.
- Full Text:
- Date Issued: 2021
A cyclic approach to business continuity planning
- Authors: Botha, Jacques
- Date: 2002
- Subjects: Data recovery (Computer science) -- Planning , Data protection , Emergency management
- Language: English
- Type: Thesis , Masters , MTech (Information Technology)
- Identifier: vital:10788 , http://hdl.handle.net/10948/81 , Data recovery (Computer science) -- Planning , Data protection , Emergency management
- Description: The Information Technology (IT) industry has grown to become an integral part of the world of business today. The importance of information, and IT in particular, will in fact only increase with time (von Solms, 1999). For a large group of organizations computer systems form the basis of their day-to-day functioning (Halliday, Badendorst & von Solms, 1996). These systems evolve at an incredible pace and this brings about a greater need for securing them, as well as the organizational information processed, transmitted and stored. This technological evolution brings about new risks for an organization’s systems and information (Halliday et al., 1996). If IT fails, the business could fail as well, creating a need for more rigorous IT management (International Business Machines Corporation, 2000). For this reason, executive management must be made aware of the potential consequences that a disaster could have on the organisation (Hawkins, Yen & Chou, 2000). A disaster could be any event that would cause a disruption in the normal day-to-day functioning of an organization. Such an event could range from a natural disaster, like a fire, an earthquake or a flood, to something more trivial, like a virus or system malfunction (Hawkins et al., 2000). During the 1980s a discipline known as Disaster Recovery Planning (DRP) emerged to protect an organization’s data centre, which was central to the organisation’s IT-based structure, from the effects of disasters. This solution, however, focussed only on the protection of the data centre. During the early 1990s the focus shifted towards distributed computing and client/server technology. Data centre protection and recovery were no longer enough to ensure survival. Organizations needed to ensure the continuation of their mission-critical processes to support their continued goal of operations (IBM Global Services, 1999). 
Organizations now had to ensure that their mission-critical functions could continue while the data centre was recovering from a disaster. A different approach was required. It is for this reason that Business Continuity Planning (BCP) was accepted as a formal discipline (IBM Global Services, 1999). To ensure that business continues as usual, an organization must have a plan in place that will help it ensure both the continuation and recovery of critical business processes and the recovery of the data centre, should a disaster strike (Moore, 1995). Wilson (2000) defines a business continuity plan as “a set of procedures developed for the entire enterprise, outlining the actions to be taken by the IT organization, executive staff, and the various business units in order to quickly resume operations in the event of a service interruption or an outage”. With markets being as highly competitive as they are, an organization needs a detailed listing of steps to follow to ensure minimal loss due to downtime. This is very important for maintaining its competitive advantage and public stature (Wilson, 2000). The fact that the company’s reputation is at stake requires executive management to take continuity planning very seriously (IBM Global Services, 1999). Ensuring continuity of business processes and recovering the IT services of an organization is not the sole responsibility of the IT department. Management should therefore be aware that they could be held liable for any consequences resulting from a disaster (Kearvell-White, 1996). Having a business continuity plan in place is important to the entire organization, as everyone, from executive management to the employees, stands to benefit from it (IBM Global Services, 1999). Despite this, numerous organizations do not have a business continuity plan in place. Organizations neglecting to develop a plan put themselves at tremendous risk and stand to lose everything (Kearvell-White, 1996).
- Full Text:
- Date Issued: 2002
An information privacy model for primary health care facilities
- Authors: Boucher, Duane Eric
- Date: 2013
- Subjects: Data protection , Privacy, Right of , Medical records -- Access control , Primary health care , Medical care , Caregivers , Community health nursing , Confidential communications , Information technology -- Management
- Language: English
- Type: Thesis , Masters , MCom (Information Systems)
- Identifier: vital:11139 , http://hdl.handle.net/10353/d1007181 , Data protection , Privacy, Right of , Medical records -- Access control , Primary health care , Medical care , Caregivers , Community health nursing , Confidential communications , Information technology -- Management
- Description: The revolutionary migration within the health care sector towards the digitisation of medical records for convenience or compliance touches on many concerns with respect to ensuring the security of patient personally identifiable information (PII). Foremost of these is that a patient’s right to privacy is not violated. To this end, it is necessary that health care practitioners have a clear understanding of the various constructs of privacy in order to ensure privacy compliance is maintained. This research project focuses on an investigation of privacy from a multidisciplinary philosophical perspective to highlight the constructs of information privacy. These constructs together with a discussion focused on the confidentiality and accessibility of medical records results in the development of an artefact represented in the format of a model. The formulation of the model is accomplished by making use of the Design Science research guidelines for artefact development. Part of the process required that the artefact be refined through the use of an Expert Review Process. This involved an iterative (three phase) process which required (seven) experts from the fields of privacy, information security, and health care to respond to semi-structured questions administered with an interview guide. The data analysis process utilised the ISO/IEC 29100:2011(E) standard on privacy as a means to assign thematic codes to the responses, which were then analysed. The proposed information privacy model was discussed in relation to the compliance requirements of the South African Protection of Personal Information (PoPI) Bill of 2009 and their application in a primary health care facility. The proposed information privacy model provides a holistic view of privacy management that can residually be used to increase awareness associated with the compliance requirements of using patient PII.
- Full Text:
- Date Issued: 2013
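The data analysis step described in the abstract — assigning thematic codes drawn from ISO/IEC 29100 to expert responses — might be sketched as a simple keyword tagger. The principle names below follow ISO/IEC 29100, but the keyword lists and the helper itself are invented for illustration; real thematic coding is a manual, interpretive process:

```python
# Hypothetical keyword lists; only the principle names come from ISO/IEC 29100.
PRINCIPLE_KEYWORDS = {
    "consent and choice": ["consent", "opt-in", "permission"],
    "data minimization": ["minimal", "only necessary", "least"],
    "accountability": ["responsible", "audit", "accountable"],
}

def assign_codes(response):
    """Return the privacy principles whose keywords appear in a response."""
    text = response.lower()
    return sorted(
        principle
        for principle, keywords in PRINCIPLE_KEYWORDS.items()
        if any(keyword in text for keyword in keywords)
    )

codes = assign_codes("Patients must give consent before an audit trail is shared.")
```

A tagger like this could at most pre-screen transcripts; the thesis's iterative expert-review phases are what give the codes their validity.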
Distributed authentication for resource control
- Authors: Burdis, Keith Robert
- Date: 2000
- Subjects: Computers -- Access control , Data protection , Computer networks -- Security measures , Electronic data processing departments -- Security measures
- Language: English
- Type: Thesis , Masters , MSc
- Identifier: vital:4630 , http://hdl.handle.net/10962/d1006512 , Computers -- Access control , Data protection , Computer networks -- Security measures , Electronic data processing departments -- Security measures
- Description: This thesis examines distributed authentication in the process of controlling computing resources. We investigate user sign-on and two of the main authentication technologies that can be used to control a resource through authentication and to provide additional security services. The problems with the existing sign-on scenario are that users have too much credential information to manage and are prompted for this information too often. Single Sign-On (SSO) is a viable solution to this problem if physical procedures are introduced to minimise the risks associated with its use. The Generic Security Services API (GSS-API) provides security services in a manner independent of the environment in which these security services are used, encapsulating security functionality and insulating users from changes in security technology. The underlying security functionality is provided by GSS-API mechanisms. We developed the Secure Remote Password GSS-API Mechanism (SRPGM) to provide a mechanism that has low infrastructure requirements, is password-based and does not require the use of long-term asymmetric keys. We provide implementations of the Java GSS-API bindings and the LIPKEY and SRPGM GSS-API mechanisms. The Simple Authentication and Security Layer (SASL) provides security to connection-based Internet protocols. After finding deficiencies in existing SASL mechanisms we developed the Secure Remote Password SASL mechanism (SRP-SASL), which provides strong password-based authentication and countermeasures against known attacks, while still being simple and easy to implement. We provide implementations of the Java SASL binding and several SASL mechanisms, including SRP-SASL.
- Full Text:
- Date Issued: 2000
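The password-based, no-long-term-keys property that SRPGM and SRP-SASL rely on comes from the Secure Remote Password protocol, in which the server stores a verifier derived from the password rather than the password itself. A minimal sketch of that derivation, using a toy modulus instead of the standardised SRP groups (RFC 5054) and an assumed hash layout:

```python
import hashlib

# Toy parameters for illustration only; real SRP deployments use the
# large, vetted group parameters from RFC 5054.
N = 2**127 - 1  # stand-in modulus (a Mersenne prime)
g = 2

def srp_verifier(username, password, salt):
    """Derive the verifier v = g^x mod N that the server stores.

    The server keeps only (salt, v), never the password, so a stolen
    server database does not directly reveal user passwords.
    """
    inner = hashlib.sha256(f"{username}:{password}".encode()).digest()
    x = int.from_bytes(hashlib.sha256(salt + inner).digest(), "big") % N
    return pow(g, x, N)

v = srp_verifier("alice", "s3cret", b"\x01\x02")
```

During authentication the two sides then run the SRP exchange against this verifier; the derivation above is only the enrolment step.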
Trust on the semantic web
- Authors: Cloran, Russell Andrew
- Date: 2007 , 2006-08-07
- Subjects: Semantic Web , RDF (Document markup language) , XML (Document markup language) , Knowledge acquisition (Expert systems) , Data protection
- Language: English
- Type: Thesis , Masters , MSc
- Identifier: vital:4649 , http://hdl.handle.net/10962/d1006616 , Semantic Web , RDF (Document markup language) , XML (Document markup language) , Knowledge acquisition (Expert systems) , Data protection
- Description: The Semantic Web is a vision to create a “web of knowledge”; an extension of the Web as we know it which will create an information space which will be usable by machines in very rich ways. The technologies which make up the Semantic Web allow machines to reason across information gathered from the Web, presenting only relevant results and inferences to the user. Users of the Web in its current form assess the credibility of the information they gather in a number of different ways. If processing happens without the user being able to check the source and credibility of each piece of information used in the processing, the user must be able to trust that the machine has used trustworthy information at each step of the processing. The machine should therefore be able to automatically assess the credibility of each piece of information it gathers from the Web. A case study on advanced checks for website credibility is presented, and the site examined in the case study is found to be credible, despite failing many of the checks presented. A website with a backend based on RDF technologies is constructed. A better understanding of RDF technologies and good knowledge of the RAP and Redland RDF application frameworks is gained. The second aim of constructing the website was to gather information to be used for testing various trust metrics. The website did not gain widespread support, however, and therefore not enough data was gathered for this. Techniques for presenting RDF data to users were also developed during website development, and these are discussed. Experiences in gathering RDF data are presented next. A scutter was successfully developed, and the data smushed to create a database in which uniquely identifiable objects were linked, even where gathered from different sources. Finally, the use of digital signatures as a means of linking an author and content produced by that author is presented. 
RDF/XML canonicalisation is discussed as a means of providing cryptographic checking of RDF graphs themselves, rather than simply checking at the document level. The notion of canonicalisation at the semantic, structural and syntactic levels is proposed. A combination of an existing canonicalisation algorithm and a restricted RDF/XML dialect is presented as a solution to the RDF/XML canonicalisation problem. We conclude that a trusted Semantic Web is possible, with buy-in from publishing and consuming parties.
- Full Text:
- Date Issued: 2007
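The canonicalisation problem described above — that signing an RDF graph, rather than one document serialising it, requires a canonical form independent of statement order — can be illustrated at the simplest level. This sketch handles only ground triples with an invented serialisation; blank nodes are what make the real problem hard, and they are exactly what the thesis's restricted-dialect approach addresses:

```python
import hashlib

def canonical_hash(triples):
    """Hash a set of ground RDF triples in a canonical order.

    Serialise each triple as an N-Triples-style line, sort the lines,
    and hash the result, so that two serialisations of the same graph
    produce the same digest regardless of statement order.
    """
    lines = sorted(f"<{s}> <{p}> {o} ." for s, p, o in triples)
    return hashlib.sha256("\n".join(lines).encode()).hexdigest()

g1 = [("ex:a", "ex:knows", "ex:b"), ("ex:a", "ex:name", '"Alice"')]
g2 = list(reversed(g1))  # same graph, different statement order
```

A digital signature over such a digest binds the author to the graph itself rather than to one particular RDF/XML document.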
An investigation of ISO/IEC 27001 adoption in South Africa
- Authors: Coetzer, Christo
- Date: 2015
- Subjects: ISO 27001 Standard , Information technology -- Security measures , Computer security , Data protection
- Language: English
- Type: Thesis , Masters , MSc
- Identifier: vital:4720 , http://hdl.handle.net/10962/d1018669
- Description: The research objective of this study is to investigate the low adoption of the ISO/IEC 27001 standard in South African organisations. This study does not differentiate between the ISO/IEC 27001:2005 and ISO/IEC 27001:2013 versions, as the focus is on adoption of the ISO/IEC 27001 standard. A survey-based research design was selected as the data collection method. The research instruments used in this study include a web-based questionnaire and in-person interviews with the participants. Based on the findings of this research, the organisations that participated in this study have an understanding of the ISO/IEC 27001 standard; however, fewer than a quarter of these have fully adopted the ISO/IEC 27001 standard. Furthermore, the main business objectives for organisations that have adopted the ISO/IEC 27001 standard were to ensure legal and regulatory compliance, and to fulfil client requirements. An Information Security Management System management guide based on the ISO/IEC 27001 Plan-Do-Check-Act model is developed to help organisations interested in the standard move towards ISO/IEC 27001 compliance.
- Full Text:
- Date Issued: 2015
A framework for information security management in local government
- Authors: De Lange, Joshua
- Date: 2017
- Subjects: Computer security -- Management , Data protection
- Language: English
- Type: Thesis , Masters , MTech
- Identifier: http://hdl.handle.net/10948/7588 , vital:21932
- Description: Information has become so pervasive within enterprises and everyday life that it is almost indispensable. This is clear as information has become core to the business operations of any enterprise. Information and communication technology (ICT) systems are heavily relied upon to store, process and transmit this valuable commodity. Due to its immense value, information and related ICT resources have to be adequately protected. This protection of information is commonly referred to as information security.
- Full Text:
- Date Issued: 2017
A model for information security control audit for small to mid-sized organisations
- Authors: Deysel, Natasha
- Date: 2009
- Subjects: Data protection , Computer networks -- Information technology
- Language: English
- Type: Thesis , Masters , MA
- Identifier: vital:9760 , http://hdl.handle.net/10948/940 , Data protection , Computer networks -- Information technology
- Description: Organisations are increasingly dependent on their information. Compromise to this information in terms of loss, inaccuracy or competitors gaining unauthorised access could have devastating consequences for the organisation. Therefore, information security governance has become a major concern for all organisations, large and small. Information security governance is based on a set of policies and internal controls by which organisations direct and manage their information security. An effective information security governance programme should be based on a recognised framework, such as the Control Objectives for Information and related Technology (COBIT). COBIT focuses on what control objectives must be achieved in order to effectively manage the information technology environment. It has become very clear that if a company is serious about information security governance, it needs to apply the COBIT framework that deals with information security. The problem in some medium-sized organisations is that they do not realise the importance of information security governance and are either unaware of the risks or choose to ignore these risks as they do not have the expertise or resources available to provide them with assurance that they have the right information security controls in place to protect their organisation against threats.
- Full Text:
- Date Issued: 2009
The cost of free instant messaging: an attack modelling perspective
- Authors: Du Preez, Riekert
- Date: 2006
- Subjects: Computer security , Instant messaging , Data protection
- Language: English
- Type: Thesis , Masters , MTech
- Identifier: vital:9797 , http://hdl.handle.net/10948/499 , http://hdl.handle.net/10948/d1011921 , Computer security , Instant messaging , Data protection
- Description: Instant Messaging (IM) has grown tremendously over the last few years. Even though IM was originally developed as a social chat system, it has found a place in many companies, where it is being used as an essential business tool. However, many businesses rely on free IM and have not implemented a secure corporate IM solution. Most free IM clients were never intended for use in the workplace and, therefore, lack strong security features and administrative control. Consequently, free IM clients can provide attackers with an entry point for malicious code in an organization’s network that can ultimately lead to a company’s information assets being compromised. Therefore, even though free IM allows for better collaboration in the workplace, it comes at a cost, as the title of this dissertation suggests. This dissertation sets out to answer the question of how free IM can facilitate an attack on a company’s information assets. To answer the research question, the dissertation defines an IM attack model that models the ways in which an information system can be attacked when free IM is used within an organization. The IM attack model was created by categorising IM threats using the STRIDE threat classification scheme. The attacks that realize the categorised threats were then modelled using attack trees as the chosen attack modelling tool. Attack trees were chosen because of their ability to model the sequence of attacker actions during an attack. The author defined an enhanced graphical notation that was adopted for the attack trees used to create the IM attack model. The enhanced attack tree notation extends traditional attack trees to allow nodes in the trees to be of different classes and, therefore, allows attack trees to convey more information. During the process of defining the IM attack model, a number of experiments were conducted where IM vulnerabilities were exploited. 
Thereafter, a case study was constructed to document a simulated attack on an information system that involves the exploitation of IM vulnerabilities. The case study demonstrates how an attacker’s attack path relates to the IM attack model in a practical scenario. The IM attack model provides insight into how IM can facilitate an attack on a company’s information assets. The creation of the attack model for free IM led to several realizations. The IM attack model revealed that even though the use of free IM clients may seem harmless, such IM clients can facilitate an attack on a company’s information assets. Furthermore, certain IM vulnerabilities may not pose a great risk by themselves, but when combined with the exploitation of other vulnerabilities, a much greater threat can be realized. These realizations hold true to what French playwright Jean Anouilh once said: “What you get free costs too much”.
- Full Text:
- Date Issued: 2006
An analysis of the use of DNS for malicious payload distribution
- Authors: Dube, Ishmael
- Date: 2019
- Subjects: Internet domain names , Computer networks -- Security measures , Computer security , Computer network protocols , Data protection
- Language: English
- Type: text , Thesis , Masters , MSc
- Identifier: http://hdl.handle.net/10962/97531 , vital:31447
- Description: The Domain Name System (DNS) protocol is a fundamental part of Internet activity that can be abused by cybercriminals to conduct malicious activities. Previous research has shown that cybercriminals use different methods, including the DNS protocol, to distribute malicious content, remain hidden and avoid detection by the various technologies that are put in place to detect anomalies. This allows botnets and certain malware families to establish covert communication channels that can be used to send or receive data and also to distribute malicious payloads using DNS queries and responses. By embedding certain strings in DNS packets, cybercriminals use the DNS to breach highly protected networks, distribute malicious content, and exfiltrate sensitive information without being detected by the security controls put in place. This research broadens the field and fills an existing research gap by extending the analysis of DNS as a payload distribution channel to the detection of domains and channels that are used to distribute different malicious payloads. Passive DNS data, which replicates the DNS queries seen by name servers, was evaluated and analysed in order to detect anomalies in DNS queries and thereby detect malicious payloads. The research characterises the malicious payload distribution channels by analysing passive DNS traffic and modelling the DNS query and response patterns. The research found that it is possible to detect malicious payload distribution channels through the analysis of DNS TXT resource records.
- Full Text:
- Date Issued: 2019
Implementing the CoSaWoE models in a commercial workflow product
- Authors: Erwee, Carmen
- Date: 2005
- Subjects: Computers -- Access control , Workflow , Computer security , Data protection
- Language: English
- Type: Thesis , Masters , MTech
- Identifier: vital:9732 , http://hdl.handle.net/10948/169 , Computers -- Access control , Workflow , Computer security , Data protection
- Description: Workflow systems have gained popularity not only as a research topic, but also as a key component of Enterprise Resource Planning packages and e-business. Comprehensive workflow products that automate intra- as well as inter-organizational information flow are now available for commercial use. Standardization efforts have centered mostly around the interoperability of these systems; however, a standard access control model has yet to be adopted. The research community has developed several models for access control to be included as part of workflow functionality. Commercial systems, however, are still implementing access control functionality in a proprietary manner. This dissertation investigates whether a comprehensive model for gaining context-sensitive access control, namely CoSAWoE, can be purposefully implemented in a commercial workflow product. Using methods such as an exploratory prototype, various aspects of the model were implemented to gain an understanding of the difficulties developers face when attempting to map the model to existing proprietary software. Oracle Workflow was chosen as an example of a commercial workflow product. An investigation of the features of this product, together with the prototype, revealed the ability to affect access control in a similar manner to the model: by specifying access control constraints during administration and design, and then enforcing those constraints dynamically during run-time. However, only certain components within these two aspects of the model directly affected the commercial workflow product. It was argued that the first two requirements of context-sensitive access control, order of events and strict least privilege, addressed by the object design, role engineering and session control components of the model, can be simulated if such capabilities are not pertinently available as part of the product. As such, guidelines were provided for how this can be achieved in Oracle Workflow.
However, most of the implementation effort focussed on the last requirement of context-sensitive access control, namely separation of duties. The CoSAWoE model proposes SoD administration steps that include expressing various business rules through a set of conflicting entities which are maintained outside the scope of the workflow system. This component was implemented easily enough through tables created with a relational database. Evaluating these conflicts during run-time to control worklist generation proved more difficult. First, a thorough understanding of the way in which workflow history is maintained was necessary. A re-usable function was developed to prune user lists according to user involvement in previous tasks in the workflow and the conflicts specified for those users and tasks. However, due to the lack of a central access control service, this re-usable function must be included in the appropriate places in the workflow process model. Furthermore, the dissertation utilized a practical example to develop a prototype. This prototype served a dual purpose: firstly, to aid the author's understanding of the features and principles involved, and secondly, to illustrate and explore the implementation of the model as described in the previous paragraphs. In conclusion, the dissertation summarized which of the CoSAWoE model's components were found to be product agnostic, directly or indirectly implementable, or not implemented in the chosen workflow product. The lessons learnt and issues surrounding the implementation effort were also discussed, before further research in terms of XML documents as data containers for the workflow process was suggested.
- Full Text:
- Date Issued: 2005
Governing information security within the context of "bring your own device" in small, medium and micro enterprises
- Authors: Fani, Noluvuyo
- Date: 2017
- Subjects: Data protection , Computer security -- Management , Computer networks -- Security measures
- Language: English
- Type: Thesis , Masters , MTech
- Identifier: http://hdl.handle.net/10948/7626 , vital:22114
- Description: Throughout history, information has been core to the communication, processing and storage of most tasks in the organisation, in this case in Small, Medium and Micro Enterprises (SMMEs). The implementation of these tasks relies on Information and Communication Technology (ICT). ICT is constantly evolving, and with each newly developed ICT, it becomes important that organisations adapt to the changing environment. Organisations need to adapt by incorporating innovative ICT that allows employees to perform their tasks with ease anywhere and anytime, whilst reducing the costs affiliated with the ICT. In the modern era, performing tasks with ease anywhere and anytime requires that the employee is mobile whilst using the ICT. As a result, a relatively new phenomenon called “Bring Your Own Device” (BYOD) is currently infiltrating most organisations, where personally-owned mobile devices are used to access organisational information and to conduct the various tasks of the organisation. The use of BYOD brings the previously mentioned benefits, such as performing organisational tasks anywhere and anytime. However, alongside these benefits, organisations should be aware that there are risks to the implementation of BYOD. Therefore, organisations should implement BYOD with proper management thereof.
- Full Text:
- Date Issued: 2017
A study of South African computer users' password usage habits and attitude towards password security
- Authors: Friendman, Brandon
- Date: 2014
- Subjects: Computers -- Access control -- Passwords , Computer users -- Attitudes , Internet -- Access control , Internet -- Security measures , Internet -- Management , Data protection
- Language: English
- Type: text , Thesis , Masters , MSc
- Identifier: vital:4700
- Description: The challenge of having to create and remember a secure password for each user account has become a problem for many computer users and can lead to bad password management practices. Simpler and less secure passwords are often selected and are regularly reused across multiple user accounts. Computer users within corporations and institutions are subject to password policies: policies which require users to create passwords of a specified length and composition and to change passwords regularly. These policies often prevent users from reusing previously selected passwords. Security vendors and professionals have sought to improve or even replace password authentication. Technologies such as multi-factor authentication and single sign-on have been developed to complement or even replace password authentication. The objective of the study was to investigate the password habits of South African computer and internet users. The aim was to assess their attitudes towards password security, to determine whether password policies affect the manner in which they manage their passwords, and to investigate their exposure to alternative authentication technologies. The results from the online survey demonstrated that the password practices of the participants, across both their professional and personal contexts, were generally insecure. Participants often used shorter, simpler and ultimately less secure passwords. Participants would try to memorise all of their passwords or reuse the same password on most of their accounts. Many participants had not received any security awareness training, and additional security technologies (such as multi-factor authentication or password managers) were seldom used or provided to them. The password policies encountered by the participants in their organisations did little to encourage users to apply more secure password practices. Users lacked knowledge and understanding of password security, as they had received little or no training pertaining to it.
- Full Text:
- Date Issued: 2014
A critical review of the IFIP TC11 Security Conference Series
- Authors: Gaadingwe, Tshepo Gaadingwe
- Date: 2007
- Subjects: Database security , Data protection , Computers -- Access control
- Language: English
- Type: Thesis , Masters , MTech
- Identifier: vital:9795 , http://hdl.handle.net/10948/507 , Database security , Data protection , Computers -- Access control
- Description: Over the past few decades the field of computing has grown and evolved. In this time, information security research has experienced the same type of growth. The increase in importance of, and interest in, information security research is reflected by the sheer number of research efforts being produced by different types of organizations around the world. One such organization is the International Federation for Information Processing (IFIP), more specifically the IFIP Technical Committee 11 (IFIP TC11). The IFIP TC11 community has had a rich history of producing high-quality, information security-specific articles for over 20 years. IFIP TC11 therefore found it necessary to reflect on this history, mainly to try to discover where it came from and where it may be going. The 20th anniversary of its main conference presented an opportunity to begin such a study of its history. The core belief driving the study was that the future can only be realized and appreciated if the past is well understood. The main area of interest was to identify topics which had prevalence in the past or could be considered "hot" topics. To achieve this, the author developed a systematic process for the study, the underpinning element being the creation of a classification scheme, which was used to aid the analysis of IFIP TC11's 20 years' worth of articles. Major themes were identified and trends in the series highlighted. Further discussion and reflection on these trends were given. It was found that, not surprisingly, the series covered a wide variety of topics over the 20 years. However, it was discovered that there has been a notable move towards technically focused papers. Furthermore, topics such as business continuity had just about disappeared from the series, while topics related to networking and cryptography continued to gain prevalence.
- Full Text:
- Date Issued: 2007
Enabling e-learning 2.0 in information security education: a semantic web approach
- Authors: Goss, Ryan Gavin
- Date: 2009
- Subjects: Data protection , Computers -- Access control , Electronic data processing -- Security measures , Electronic data processing departments -- Security measures
- Language: English
- Type: Thesis , Masters , MTech
- Identifier: vital:9771 , http://hdl.handle.net/10948/909 , Data protection , Computers -- Access control , Electronic data processing -- Security measures , Electronic data processing departments -- Security measures
- Description: The motivation for this study argued that current information security education systems are inadequate for educating all users of computer systems worldwide in acting securely during their operations with information systems. There is, therefore, a pervasive need for information security knowledge in all aspects of modern life. E-Learning 2.0 could possibly contribute to solving this problem; however, little or no knowledge currently exists regarding the suitability and practicality of using such systems to convey information security knowledge to learners.
- Full Text:
- Date Issued: 2009
A model for cultivating resistance to social engineering attacks
- Authors: Jansson, Kenny
- Date: 2011
- Subjects: Computer security , Data protection , Human-computer interaction
- Language: English
- Type: Thesis , Masters , MTech
- Identifier: vital:9744 , http://hdl.handle.net/10948/1588 , Computer security , Data protection , Human-computer interaction
- Description: The human being is commonly considered to be the weakest link in information security. As information is one of the most critical assets in an organization today, it is essential that the human element is considered in deployments of information security countermeasures. However, the human element is often neglected in this regard. Consequently, many criminals are now targeting the user directly to obtain sensitive information, instead of spending days or even months trying to hack through systems. Some criminals target users by utilizing various social engineering techniques to deceive the user into disclosing information. For this reason, the users of the Internet and ICT-related technologies are nowadays very vulnerable to various social engineering attacks. As a contribution towards increasing users’ social engineering awareness, a model – called SERUM – was devised. SERUM aims to cultivate social engineering resistance within a community by exposing the users of the community to ‘fake’ social engineering attacks. The users who react incorrectly to these attacks are instantly notified and requested to participate in an online social engineering awareness program. Thus, users are educated on demand. The model was implemented as a software system and was utilized to conduct a phishing exercise on all the students of the Nelson Mandela Metropolitan University. The aim of the phishing exercise was to determine whether SERUM is effective in cultivating social engineering resistant behaviour within a community. The phishing exercise proved successful and yielded positive results, indicating that a model like SERUM can indeed be used to educate users regarding phishing attacks.
- Full Text:
- Date Issued: 2011
Managing an information security policy architecture : a technical documentation perspective
- Authors: Maninjwa, Prosecutor Mvikeli
- Date: 2012
- Subjects: Computer security -- Management , Computer architecture , Data protection
- Language: English
- Type: Thesis , Masters , MTech
- Identifier: vital:9825 , http://hdl.handle.net/10948/d1020757
- Description: Information and the related assets form critical business assets for most organizations. Organizations depend on their information assets to survive and to remain competitive. However, the organization’s information assets are faced with a number of internal and external threats, aimed at compromising the confidentiality, integrity and/or availability (CIA) of information assets. These threats can be of a physical, technical, or operational nature. For an organization to successfully conduct its business operations, information assets should always be protected from these threats. The process of protecting information and its related assets, ensuring the CIA thereof, is referred to as information security. To be effective, information security should be viewed as critical to the overall success of the organization, and therefore be included as one of the organization’s Corporate Governance sub-functions, referred to as Information Security Governance. Information Security Governance is the strategic system for directing and controlling the organization’s information security initiatives. Directing is the process whereby management issues directives, giving a strategic direction for information security within an organization. Controlling is the process of ensuring that management directives are being adhered to within an organization. To be effective, Information Security Governance directing and controlling depend on the organization’s Information Security Policy Architecture. An Information Security Policy Architecture is a hierarchical representation of the various information security policies and related documentation that an organization uses. When directing, management directives should be issued in the form of an Information Security Policy Architecture, and controlling should ensure adherence to the Information Security Policy Architecture.
However, this study noted that in both literature and organizational practices, Information Security Policy Architectures are not comprehensively addressed and adequately managed. Therefore, this study argues towards a more comprehensive Information Security Policy Architecture, and the proper management thereof.
- Full Text:
- Date Issued: 2012
Information security awareness: generic content, tools and techniques
- Authors: Mauwa, Hope
- Date: 2007
- Subjects: Computer security , Data protection , Computers -- Safety measures , Information technology -- Security measures
- Language: English
- Type: Thesis , Masters , MTech
- Identifier: vital:9733 , http://hdl.handle.net/10948/560 , Computer security , Data protection , Computers -- Safety measures , Information technology -- Security measures
- Description: In today’s computing environment, awareness programmes play an increasingly important role in organizations’ complete information security programmes. Information security awareness programmes exist to change behaviour or reinforce good security practices, and to provide a baseline of security knowledge for all information users. Security awareness is a learning process which changes individual and organizational attitudes and perceptions so that the importance of security, and the adverse consequences of its failure, are realized. Therefore, with proper awareness, employees become the most effective layer in an organization’s security defence. Given the important role that awareness programmes play in organizations’ complete information security programmes, all organizations that are serious about information security must implement them. But though awareness programmes have become increasingly important, the level of awareness in most organizations is still low. It seems that the current approach to developing these programmes does not satisfy the needs of most organizations. Therefore, another approach, which tries to meet the needs of most organizations, is proposed in this project as part of the solution to raising the level of awareness in organizations.
- Full Text:
- Date Issued: 2007
An investigation of issues of privacy, anonymity and multi-factor authentication in an open environment
- Authors: Miles, Shaun Graeme
- Date: 2012-06-20
- Subjects: Electronic data processing departments -- Security measures , Electronic data processing departments , Privacy, Right of , Computer security , Data protection , Computers -- Access control
- Language: English
- Type: Thesis , Masters , MSc
- Identifier: vital:4656 , http://hdl.handle.net/10962/d1006653 , Electronic data processing departments -- Security measures , Electronic data processing departments , Privacy, Right of , Computer security , Data protection , Computers -- Access control
- Description: This thesis performs an investigation into issues concerning the broad area of Identity and Access Management, with a focus on open environments. Through literature research, the issues of privacy, anonymity and access control are identified. The issue of privacy is an inherent problem due to the nature of the digital network environment. Information can be duplicated and modified regardless of the wishes and intentions of the owner of that information, unless proper measures are taken to secure the environment. Once information is published or divulged on the network, there is very little way of controlling the subsequent usage of that information. To address this issue, a model for privacy is presented that follows the user-centric paradigm of meta-identity. The lack of anonymity, where security measures can be thwarted through observation of the environment, is a concern for users and systems. By observing the communication channel and monitoring the interactions between users and systems over a long enough period of time, an attacker can infer knowledge about the users and systems. This knowledge is used to build an identity profile of potential victims, to be used in subsequent attacks. To address the problem, mechanisms for providing an acceptable level of anonymity while maintaining adequate accountability (from a legal standpoint) are explored. In terms of access control, the inherent weakness of single-factor authentication mechanisms is discussed. The typical mechanism is the username and password pair, which provides a single point of failure. By increasing the factors used in authentication, the amount of work required to compromise the system increases non-linearly. Within an open network, several aspects hinder wide-scale adoption and use of multi-factor authentication schemes, such as token management and the impact on usability.
The framework is developed from a Utopian point of view, with the aim of being applicable to many situations as opposed to a single specific domain. The framework incorporates multi-factor authentication over multiple paths using mobile phones and GSM networks, and explores the usefulness of such an approach. The models are in turn analysed, providing a discussion of the assumptions made and the problems faced by each model.
- Full Text: