The next category in the STRIDE model is denial of service. This one is the hardest to tackle. Any widely available service can be used for purposes other than intended: access to a web server can be blocked by a large amount of incoming data, or you can overload a system or an application by instructing it to perform absurd operations. To overload a computer you can also use the so-called avalanche effect, which means using multiple machines to flood the bandwidth of the target computer through standard communication attempts.

Systems with a partially implemented security policy are prone to elevation of privilege attacks. In some systems, every user has the ability to make system-wide changes. In others, regular users have limited privileges: a regular user cannot perform certain operations, such as installing new software, establishing connections, or changing the system configuration, all of which could be useful to attackers. This in turn requires them to gain elevated privileges.

The STRIDE model can be applied not only to computer systems, that is, computers and servers; it can also be used for threat modeling of database servers. You can see an example of such an application in the last slide. It shows that weak passwords make it easier for attackers to impersonate a trusted user. If we identify such a threat, we have to think about how to eliminate it, or at least limit it. One of the methods is to implement a strong password policy.

Let's see how this could be approached if we didn't have data about users' activity. Let's say you manage a database application that doesn't record the operations performed by users. As a result, you don't know who introduced a new offer, changed the price of an existing offer, or deleted a key client from the database. This is a threat to the confidentiality, consistency, and integrity of your data. Having classified the threat as repudiation, think of the best way to eliminate it. Because the problem concerns a database, the best answer is user activity monitoring tools. Depending on the database type, these can include database triggers, trace files, or database auditing.
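To make that repudiation countermeasure concrete, here is a minimal sketch of application-level audit logging in Python. The table and column names (offers, audit_log, price), the usernames, and the use of SQLite are illustrative assumptions, not part of the lecture's example application; a production system would more likely rely on the database engine's own triggers, trace files, or auditing features mentioned above.

```python
# Minimal sketch: record who performed each write, so operations can no longer be repudiated.
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")  # illustrative in-memory database
conn.executescript("""
    CREATE TABLE offers (id INTEGER PRIMARY KEY, name TEXT, price REAL);
    CREATE TABLE audit_log (
        id        INTEGER PRIMARY KEY AUTOINCREMENT,
        username  TEXT NOT NULL,
        action    TEXT NOT NULL,
        details   TEXT,
        logged_at TEXT NOT NULL
    );
""")

def audited_execute(username, action, sql, params=()):
    """Run a write operation and record who performed it, in the same transaction."""
    with conn:  # commit both statements together, or roll both back on error
        conn.execute(sql, params)
        conn.execute(
            "INSERT INTO audit_log (username, action, details, logged_at) "
            "VALUES (?, ?, ?, ?)",
            (username, action, f"{sql} -- params: {params}",
             datetime.now(timezone.utc).isoformat()),
        )

# Example usage: afterwards we can tell who introduced an offer and who changed its price.
audited_execute("alice", "INSERT_OFFER",
                "INSERT INTO offers (name, price) VALUES (?, ?)", ("Basic plan", 10.0))
audited_execute("bob", "UPDATE_PRICE",
                "UPDATE offers SET price = ? WHERE name = ?", (12.5, "Basic plan"))

for row in conn.execute("SELECT username, action, logged_at FROM audit_log"):
    print(row)
```

With every write routed through a helper like this, the audit log answers exactly the questions raised above: who introduced the new offer, who changed its price, and who deleted the key client.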
In the diagram you can see that the database resources include clients' personal data. We treat this data as confidential and try to describe the threats to this resource. It seems reasonable that the resource is vulnerable to the threat called information disclosure. Having identified the threat, we try to find a technological or non-technological solution that would help us limit or eliminate it. The solution may include data encryption or some other access control mechanism we trust.

I hope that, on the basis of the analysis of the model database application and the diagrams shown in the slides, you'll be able to perform such a system analysis on your own. This would be the first step towards the development of a security policy.

Thank you for your attention.