NCSA 1997 Computer Virus Prevalence Survey
NCSA®
National Computer Security Association


Cheyenne Software
Command Software Systems
Deloitte & Touche, LLP
Dr. Solomon Software, Int'l.
INTEL
McAfee Associates
Microsoft Corporation
ON Technology
Quarterdeck Corporation
Symantec Corporation
Trend Micro Inc.


Table of Contents

Figures

Figure 1. Infections Per 1,000 Computers Per Month, 1996-1997
Figure 2. Likelihood of a Virus Encounter.
Figure 3. Infections per Month per 1,000 Computers, Top Ten Viruses
Figure 4. Relative Dominance of Top Ten Viruses, 1997
Figure 5. Dominance of Top Ten Viruses, 1996
Figure 6. Infection Sources
Figure 7. Sources of Infection, Boot and Macro Viruses, 1997.
Figure 8. Effects of Viruses, 1996-1997
Figure 9. Relative Prevalence of Top Ten Viruses
Figure 10. Length of Time to Completely Recover from Most Recent Incident, 1996-1997
Figure 11. Person-Days Most Recent Incident Cost the Group, 1996-1997
Figure 12. Stated Cost of Virus Incident, 1996-1997.

Tables

Table 1. Respondent’s Opinion of State of the Virus Problem in the Computing Industry Compared to This Time Last Year
Table 2. Respondent’s Opinion of State of the Macro Virus Problem in the Computing Industry Compared to This Time Last Year
Table 3. Percent of Sites Experiencing a Virus Encounter.
Table 4. Infections Per 1,000 Computers Per Month
Table 5. Month of Most Recent Disaster
Table 6. Cost Comparison of Disasters, 1996-1997
Table 7. Which viruses have affected your group's PCs during 1997?
Table 8. Titles of Respondents
Table 9. Respondents Department, 1996 and 1997
Table 10. Percentage of Organizations Running Various Operating Systems
Table 11. Percent of Organizations Running Various Operating Systems
Table 12. Primary Line of Business.
Table 13. Encounters Per 1,000 Computers Per Month, 1996-1997 Studies.
Table 14. Infections per Month per 1,000 Computers, Top Ten Viruses
Table 15. Reality Check on Encounter Rate.
Table 16. Other Viruses Reported, 1997.
Table 17. Sources of Infection, 1996-1997
Table 18. Sources of Infection, Boot and Macro Viruses, 1997.
Table 19. Means of Infection Summary, 1996 Survey
Table 20. Effects of Viruses, 1996-1997
Table 21. Most Commonly Found: Percent of Organizations Infected
Table 22. Viruses Most Commonly Found: Total Infections
Table 23. PCs and Servers Suspected/Actually Infected During Most Recent Incident, 1996-1997
Table 24. Desktop Virus Protection Methods Used
Table 25. Number of Desktop Protection Methods Used
Table 26. Effectiveness in Virus Prevention, Desktop Encounters 1997
Table 27. Effectiveness in Averting Disasters
Table 28. Server Virus Protection Methods Used
Table 29. Percent of Servers Running Either Periodic or Full-Time Scans, or Both
Table 30. E-mail, proxy servers, and firewalls with virus protection

I. Objectives

The objective of this project is to identify the nature and extent of the computer virus problem in PC-type computers and networks. The scope of the survey includes:

The results of this research are being used by the National Computer Security Association (NCSA) to increase the public awareness of the extent of the computer virus problem.

II. Research Methodology

Confidence -- To meet the objectives of the survey, telephone interviews were completed with 300 end-users. This sample size provides an accuracy of plus or minus 5.6% with 95 percent confidence for questions that relate to the entire sample. Internal consistency checking (where similar data were arrived at by different means and different questions) suggests that respondent-estimation errors may be as large as 50% in some cases.
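The stated margin of error can be checked against the standard formula for a sample proportion at 95 percent confidence, using the worst-case variance (p = 0.5). The sketch below is illustrative; the survey text does not specify the exact formula used:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Half-width of a 95% confidence interval for a sample proportion.

    Uses the worst-case variance p*(1-p) = 0.25 when the true
    proportion is unknown, and the normal approximation (z = 1.96).
    """
    return z * math.sqrt(p * (1 - p) / n)

# For the survey's sample of 300 respondents:
print(f"{margin_of_error(300) * 100:.1f}%")
```

This yields roughly 5.7%, in line with the plus-or-minus 5.6% stated above; small differences come from rounding or the exact z-value chosen.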

Selection -- Respondents for the survey were randomly selected from Computer Intelligence lists of sites with 500 or more PCs at that site. The sample included all service and industry SIC (Standard Industry Code) codes, as well as federal, state, and local government. Educational sites were excluded from the survey.

Interview target -- The interviews were conducted with the individual most responsible for managing virus problems on PCs or networks for the organization or site. This person was typically found through referral, and an average of six calls was required to complete an interview. Only individuals who had responsibility for 200 or more PCs, in terms of virus knowledge, prevention, and software, were qualified as respondents to the survey. Respondents were assured confidentiality to maximize their responsiveness and to enhance the overall credibility of the survey.

Trained interviewers -- All interviewing was conducted by trained interviewers dialing from a centrally supervised and monitored facility at Paria Group Market Research in Orem, Utah. Interviewing was accomplished from March 7 through March 26, 1997, during the hours of 9 a.m. through 5 p.m. Mountain Standard Time. The interviewers involved in the survey are dedicated to surveys of vendors, dealers, and end-users for the computer and networking industries. All responses were captured using an on-line survey system for direct tabulation and analysis.

Rounding -- Sometimes percentages will total more than 100 percent due to questions allowing for multiple responses. In some cases, charts or graphs will be less than 100 percent due to "Don’t Know" answers, refusals or "Other" responses not being included. Also, in a few instances, rows or columns may total either 99 percent or 101 percent due to rounding.

A copy of the questionnaire is included in Appendix A . Final disposition of dialing attempts and statistics for the execution of the survey are included in Appendix B.

Previous work – Some of the results of this survey can be directly compared with two previous surveys:

  1. A previous survey conducted for NCSA by Network Associates during March, 1996. The 1996 survey had a nearly identical design, and nearly identical demographics. Where appropriate, results from this survey will be compared with the 1996 survey.
  2. A previous survey conducted for NCSA by Dataquest during October, 1991. The 1991 NCSA-Dataquest survey had a very similar design, with very similar demographics. The primary differences are that the 1991 survey focused on organizations with a smaller number of PCs (at least 300 vs. at least 500 for this survey) and had a smaller average number of PCs per respondent (1,027 then vs. 2,162 now). The larger cut-off for inclusion was used for this survey to partially account for PC market growth over the intervening time period. The total sample of potential survey sites (North American organizations with more than 300 or 500 PCs, respectively) was about the same for both surveys (2,300 vs. 2,250 potential sites). Where appropriate, results from this survey will be compared with the 1991 survey.

Biases -- There are several potential biases to this study:

III. Executive Summary

The computer virus problem in North America is still large and is continuing to grow. Virtually all medium and large organizations in North America (99.33%) have experienced at least one computer virus infection first hand. This year's survey reports that while usage of anti-virus software is up (73% of machines, versus 60% in 1996), the infection rate continues to grow. Among the 300 respondents, representing 728,798 desktop computers and 24,270 servers, the infection rate is about 33 per 1,000 machines in any given month, or 406 per 1,000 machines in a given year. This is up from 1996, when the chance of experiencing a computer virus encounter or incident was about ten per 1,000 PCs per month.

The most common virus since the fall of 1995 is the Word.Concept virus. Word.Concept continues to grow more rapidly than any previous virus, infecting 49% of all survey sites. How prevalent is Word.Concept? In the first half of 1996 there were 11,481 reported incidents, and in the second half there were 13,662. In the first two months of 1997 alone there were 10,750 reports of Word.Concept. At this rate, Word.Concept incidents could triple during 1997. While Word.Concept far outdistanced other viruses in total numbers of incidents, a look at the "top ten" reported viruses brings more troubling news: four of those are macro viruses. One of the reasons that Word.Concept and other macro viruses are the most prevalent is that they travel with document files. This gives them the ability to travel via floppy diskettes, across networks, and as e-mail attachments. Because their existence on most computers goes unnoticed by the average end-user, they silently continue to replicate to more and more places.

As we discovered in last year's survey, users can effectively protect themselves from all viruses which have ever been shown to infect any computer by simply installing a quality anti-virus product, keeping the anti-virus software active and operational, and updating the software regularly. It has been shown that if as few as 30% of the world's PCs used a relatively current, full-time anti-virus protection method, the effect of "herd immunity" would nearly eliminate the world-wide computer virus problem.
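The "herd immunity" claim follows standard epidemic-threshold reasoning: if each infected machine passes a virus on to R0 others on average, then protecting a fraction p of machines reduces the effective reproduction number to R0 × (1 − p), and spread dies out once that value falls below 1. The sketch below is illustrative only; the R0 value of 1.4 is a hypothetical figure chosen to match the 30% coverage quoted above, not a number from the survey:

```python
def effective_r(r0, protected_fraction):
    """Effective reproduction number when a fraction of machines is protected."""
    return r0 * (1 - protected_fraction)

def herd_immunity_threshold(r0):
    """Fraction of machines that must be protected for spread to die out."""
    return 1 - 1 / r0

# Hypothetical: each infected machine infects ~1.4 others on average.
r0 = 1.4
print(f"threshold: {herd_immunity_threshold(r0):.0%}")        # about 29% coverage
print(f"R_eff at 30% coverage: {effective_r(r0, 0.30):.2f}")  # 0.98 -- below 1
```

Under this model, once coverage passes the threshold, each generation of infections is smaller than the last, so the virus population declines even though 70% of machines remain unprotected.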

IV. Principal Findings

Demographics – 1997: The average survey site had 2,454 PCs and 81 file and application servers. The total number of PCs represented by the 300 surveyed sites was approximately 728,798. The total number of file and application servers was approximately 24,270. The great majority of the surveyed organizations appear to have at least one individual with expertise in computer viruses.

Pervasive problem – 1997: Virtually all (99.33%) medium and large organizations in North America have had at least one computer virus encounter. Of the 300 sites interviewed, only two (0.67%) claimed the organization had never encountered a virus. Over two thirds (69%) of responding organizations said they have procedures to formally track computer virus problems. The average person interviewed thought he or she would be informed of or know about 82% of the virus encounters occurring at his or her organization or group.

Opinion, getting worse – Nearly 43% of respondents thought that computer virus problems had gotten worse this year, while only 19% felt things were getting better. Compared with the results from 1996, this shows a slight shift away from negative opinions: In 1996, 50% felt things were getting worse, and only 10% felt things were getting better.

Table 1. Respondent’s Opinion of State of the Virus Problem in the Computing Industry Compared to This Time Last Year

                   1996     1997
Much worse        20.0%    19.1%
Somewhat worse    30.0%    23.5%
About the same    40.0%    36.9%
Somewhat better    6.7%    12.4%
Much better        3.0%     6.7%
Don't know         0.3%     1.3%
Refused            0        0

On the other hand, respondents showed great despair over the Word Macro virus problem, with half feeling that things were worse and only about 17% feeling things were getting better. (See Table 2, below)

Table 2. Respondent’s Opinion of State of the Macro Virus Problem in the Computing Industry Compared to This Time Last Year

                  Word Macro Problem   Excel Macro Problem
Much worse        31.9%                 8.4%
Somewhat worse    18.1%                11.7%
About the same    24.2%                43.6%
Somewhat better   12.1%                 8.7%
Much better        4.7%                 2.7%
Don't know         8.7%                24.2%
Refused           <1%                  <1%

If the results of this survey are examined alone (the "March, 1997" column of Table 3, below), virus incidents or encounters (an event where viruses were discovered on any PCs, diskettes, or files) appear to be increasing:

However, comparing the 1996 and 1997 survey results shows a surprising shift. In both surveys, respondents felt things were getting worse. But our respondents in 1997 reported a relatively better February, a relatively better January, and relatively worse previous years when compared with 1996 respondents. [Table 3]. In addition, 1997 respondents were not quite as negative in their overall opinion of the state of the virus problem as respondents in 1996. [Table 1]

Table 3. Percent of Sites Experiencing a Virus Encounter.

Time Period                              March, 1996   March, 1997
February, this year (1996/7)             90%           87.3%
January, this year (1996/7)              83%           85.2%
Second half of previous year (1995/6)    71%           97.5%
First half of previous year (1995/6)     63%           93.8%
All of prior year (1994/5)               21%           93.9%

These numbers suggest that the percentage of sites experiencing an infection in a given month may be going down. However, Table 4 suggests that the number of infected machines is increasing substantially.

Virus encounters – 1997: What is the likelihood that a computer is infected with a virus? Our best estimate comes from responses to the question "How many virus encounters did you have during February, 1997?", as this question is the least biased by retrospection in our survey. Our 300 respondents had an average of 85 incidents each for the month, or about 25,500 incidents in total. Across the 728,798 computers and 24,270 file servers represented in the study, the infection rate is about 33 out of 1,000 machines infected in any given month, or 406 in 1,000 infected in a given year. These numbers are more than double the rates of the 1996 survey.
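The monthly and annual rates quoted above follow directly from the reported counts; a quick arithmetic check:

```python
respondents = 300
avg_incidents_feb_1997 = 85          # average incidents per site for the month
pcs = 728_798
servers = 24_270

total_incidents = respondents * avg_incidents_feb_1997   # 25,500
machines = pcs + servers                                 # 753,068
rate_per_1000_per_month = total_incidents / machines * 1000
rate_per_1000_per_year = rate_per_1000_per_month * 12

print(round(rate_per_1000_per_month, 2))  # 33.86, as in Table 4
print(round(rate_per_1000_per_year))      # 406
```

Note that the 406-per-1,000 annual figure is simply the February monthly rate multiplied by twelve, which assumes February 1997 was a typical month.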

We can compare the numbers for January, 1996 and January, 1997 as well, as a check on the difference between 1996 and 1997, since both should be affected by comparable levels of "retrospective bias." It turns out that respondents in 1997 actually reported more incidents for January than for February, whereas respondents in 1996 reported far fewer for January than for February. The results from Table 4 are graphed in Figure 1.

Table 4. Infections Per 1,000 Computers Per Month

Date Number
January, 1996 6.1
February, 1996 14.4
January, 1997 35.21
February, 1997 33.86

Figure 1. Infections Per 1,000 Computers Per Month, 1996-1997

Virus Likelihood -- It appears that the likelihood of an organization encountering a computer virus is an order of magnitude higher in early 1997 than it was two years earlier, and that the current likelihood is on the order of 35 virus encounters per 1,000 PCs per month. There is little doubt that the virus problem is, indeed, growing, and at a considerable rate.

Disasters -- 1997: About a third (33.89%) of all sites experienced a computer virus "disaster" (defined by the survey as a virus encounter where a minimum of 25 PCs, diskettes, or files were infected by the same virus at approximately the same time) in the preceding 14 months. It appears that virus disasters per year per organization are about as likely in 1997 as in 1996.

Respondents were asked "When was the month and year of your most recent disaster?" Most disasters were fairly recent, as may be seen from Table 5, which shows the distribution of responses from those reporting a disaster in the past 14 months. About one half of all disasters occurred in the last four months.

Table 5. Month of Most Recent Disaster

Month     Percentage
Jan-96     2.2%
Feb-96     3.2%
Mar-96     5.4%
Apr-96     2.2%
May-96     4.3%
Jun-96     3.2%
Jul-96     1.1%
Aug-96     7.5%
Sep-96     5.4%
Oct-96     6.5%
Nov-96     8.6%
Dec-96     6.5%
Jan-97    14.0%
Feb-97    22.6%
Mar-97     7.5%

Incident costs – 1997: For the 34% of sites which experienced virus disasters, servers were down for an average of .66 hours – roughly 40 minutes. Most servers were not downed at all, and the longest downtime was 24 hours. This reduction in downtime of servers may be the effect of organizations learning that downing a server is often neither necessary nor helpful in removing most viruses. Complete recovery took an average of 44 hours, 21.7 person-days of work, and an average of $8,366 in self-proclaimed costs.

The results for 1997 and 1996 differ somewhat, as may be seen in Table 6. In 1997, servers were down for a much shorter period of time; recovery took the same number of hours, but consumed roughly twice as many person-days. Costs remained about the same, adjusting for inflation.

Table 6. Cost Comparison of Disasters, 1996-1997

Cost 1996 1997
Server Downtime 5.8 hours 40 minutes
Time to Recovery 44 hours 44 hours
Person Days Lost 10 22
Financial Cost $8,100 $8,366

Word macro virus -- 1997: During 1997 one virus, the Word.Concept virus (also known as WM.Concept and Prank), infected one-half (49%) of survey sites. This year’s survey witnessed the remarkable growth of this family, with macro viruses of all sorts accounting for 80% of infections reported.

Table 7. Which viruses have affected your group's PCs during 1997?

Virus                 Percent Reporting
WelcomB                1.68%
Word Macro            15.10%
Word Macro Concept    34.23%
Word Macro Npad        4.36%
Word Macro Wazzu      19.13%
Word Macro MDMA        2.68%
Word Macro Colors      2.01%
Excel Macro            0.67%
XM Laroux              1.01%
Other (SPECIFY)       25.17%
None                   5.37%
Don't know             8.72%
Refused                0.34%

This is remarkable for several reasons. First, the Concept virus was first distributed in July, 1995; its growth rate is therefore very high compared with any previous virus. Second, macro viruses represent a new class of very "successful" viruses: they infect neither compiled executable program files (as file-type viruses do), nor the reserved low-level areas of hard and floppy disks (as boot-track viruses do). Instead, they infect document files, inserting "macro code" which is interpreted and executed by application software (most often, Microsoft Word).

Source of virus incidents – 1997: The most common means of infection was via diskettes, which apparently were responsible for the introduction of viruses in six out of ten infections. Seven percent of respondents did not know the source of their most recent virus incident or encounter. Strikingly, nearly 45% of respondents thought that their most recent virus incident began with either a download (19%) or an e-mail attachment (26%). This is more than double the rate in the 1996 survey, which in turn doubled the 1991 survey, and is almost certainly due to the growth of macro viruses, since boot-track viruses (which represent more than 95% of the remaining virus encounters) cannot practically be either downloaded or passed as e-mail attachments (they do not travel as files but, rather, as hard-coded sectors of disks).

Effects of computer viruses – The major effects of virus encounters and incidents were related to loss of user productivity, including users' loss of access to PCs and/or data. Lost or corrupted data, unreliable applications, screen messages, interference, lockups, and system crashes were also commonly reported outcomes. Interview respondents were generally good at estimating ahead of time the number of desktop PCs likely to be involved in a particular virus incident, but feared more servers were infected than proved to be the case. (In the most recent virus incident in 1997, an average of 7.6 servers were suspected of being infected, and 1.8 were actually infected; in 1996, an average of 1.6 servers were suspected of being infected, and 5.4 actually infected.) [Table 23. PCs and Servers Suspected/Actually Infected During Most Recent Incident, 1996-1997]

Use of Anti-Virus products – Virtually all sites used anti-virus products on desktop PCs, and claim to have some form of protection for about 73% of all PCs -- up from 60% in last year’s survey. [see D. Usage of Anti-Virus Products]

Of these:

Approximately 85% of sites used anti-virus products on servers and about 64% of all servers appear to have some protection installed. About 40% of protected servers are protected by "Periodic Only" scans – up from 28% in 1996 – and about 32% run "Full-Time Only" scans -- down from 36% in 1996. About 28% of protected servers use both periodic and full-time background scanning. [Table 29. Percent of Servers Running Either Periodic or Full-Time Scans, or Both]

V. Detailed Findings

A. Profile of Respondents

Each site contacted for this survey was asked who in the company was most responsible for managing virus problems on PCs or networks. Once a qualified respondent was located, these individuals were asked about the number of PCs and servers they are responsible for (in terms of viruses), their department, job title, and the primary line of business in which their company is engaged.

Job Title – There was considerable variation in the job titles of respondents, with 182 titles identified for this group of 300 respondents.

Table 8. Titles of Respondents

Title                                               Percent
Director of Data Processing/Information Systems      3.67%
Director of Computer Security                        5.00%
Manager of Data Processing                           1.33%
Manager of MIS/IS                                    8.67%
Manager of Computer Security                         3.00%
Systems Analyst/Programmer                           7.67%
Support Specialist                                   4.67%
Systems Manager/PC Network/LAN Manager              20.00%
Other                                               46.00%

Respondent’s Department – Responses on the department of the respondents are shown in the table below. Fully 73% of those surveyed in 1997 were from MIS / IS departments, down from 87% in 1996.

Table 9. Respondents Department, 1996 and 1997

Department                              1996   1997
MIS / IS                                87%    73%
Customer Service / Support               3%    <1%
Data Processing                          3%     4%
Public Relations / Communications        2%     0%
General Administration / Management      1%    <1%
Accounting / Finance                     1%     1%
Engineering                              1%    <1%
Education / Training                    <1%     0%
Manufacturing / Production              <1%     0%
Research & Development                  <1%    <1%
Personnel / Human Resources             <1%    <1%
Sales / Marketing                       <1%     0%

Number of PCs in the Group – Respondents were asked how many PCs they were responsible for in terms of virus knowledge, prevention, and software. It was recognized that the respondent might not have complete responsibility for these PCs, but would be able to talk in detail about the virus problems encountered on these PCs and the virus prevention software and techniques employed. Respondents who represented multi-national organizations were asked to limit their discussion to those PCs and servers in North America. These PCs were subsequently referred to as the PCs in their group. The average respondent was responsible for 2,454 PCs.

Desktop PC Operating System – Respondents were asked which desktop operating systems were used in their organization (group), and were also asked to estimate the number of PCs using each operating system. A total of 728,798 PCs were represented by this survey. Although the number of Macintosh systems was solicited at this stage of the survey, the respondent was instructed to ignore virus-related issues on Macintosh systems for the remainder of the survey. The breakdown of PC operating systems is shown in Table 10.

Table 10. Percentage of Organizations Running Various Operating Systems

Operating System        In Use in Company
DOS only, no Windows    45.00%
Windows 3.1             90.67%
Windows 95              88.67%
Windows NT              72.67%
OS/2                    38.67%
Macintosh               43.00%
Unix                    47.33%
Other                    4.67%

Type of Network Employed – Respondents were asked "Which LAN server operating systems does your organization have running?" as well as the total number of servers (both file servers and application servers) using each type of operating system. The average survey site had 81 file and application servers; the total number of file and application servers was approximately 24,270. The breakdown of network operating systems is shown below.

Table 11. Percent of Organizations Running Various Operating Systems

NOS                     Percent of Organizations Running
Novell NetWare 3.x      61.00%
Novell NetWare 4.x      60.67%
Windows NT              71.00%
IBM OS/2 LAN Server     14.33%
IBM LAN Server           3.33%
Banyan Vines            10.67%
Unix                    42.33%

Primary Line of Business – As shown in Table 12, Primary Line of Business, respondents represented a range of business types, with no single line of business dominating; Finance/Insurance, Government, Healthcare, Manufacturing, and Transportation/Public Utilities were the most heavily represented.

Table 12. Primary Line of Business.

Business                   Percent
Accounting                  0.67%
Business Services           0.33%
Communications              1.67%
Construction                0.33%
Education                   1.67%
Engineering                 1.33%
Finance/Insurance          13.33%
Government                 24.67%
Healthcare                 16.67%
Legal                       2.00%
Manufacturing              11.00%
Non-Profit                  0.67%
Professional Services       0.67%
Retail                      1.33%
Transportation/Utilities    6.34%
Other                      17.33%

B. Virus Encounters

Overall Level of Virus Incidents – Overall, 99.3% of respondents discovered a virus at some time on a PC, file, or diskette in their organization. As expected, larger groups were much more likely to have encountered a virus during any given time period (as they have a proportionally larger number of PCs to get infected). The group of 300 organizations had a total of 145,753 encounters over the 3.2 years in question on the 728,798 machines represented, or 62.5 encounters per 1,000 machines per year over the 3.2-year period.
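The 62.5 encounters-per-1,000-per-year figure can be reproduced from the raw totals reported above:

```python
total_encounters = 145_753   # encounters reported by all 300 organizations
total_machines = 728_798     # PCs represented in the survey
period_years = 3.2           # span of time covered by the questions

encounters_per_1000_per_year = (
    total_encounters / total_machines / period_years * 1000
)
print(round(encounters_per_1000_per_year, 1))  # 62.5
```

This is an average over the whole 3.2-year window; as Table 13 shows, the rate in the most recent months is several times higher than this long-run average.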

Table 13 compares last year’s study with this year’s. In general, encounters have been steadily increasing, with reasonable correspondence between the two studies. We have plotted these values in Figure 2. There is probably some retrospective bias at work here, but if comparable intervals are compared (e.g., Feb ’97 in the 1997 study with Feb ’96 in the 1996 study), we see that the probability of an encounter has approximately doubled in the past year.

Table 13. Encounters Per 1,000 Computers Per Month, 1996-1997 Studies.

Period           1996 Study   1997 Study
Feb ’97              --          28.0
Jan ’97              --          14.9
July-Dec ’96         --          10.0
Jan-Jun ’96          --           9.1
Feb ’96             14.4          --
Jan ’96              6.1          --
July-Dec ’95         3.0          --
Jan-Jun ’95          1.5          --
1995                 --           3.5
1994                 0.2          --

Figure 2. Likelihood of a Virus Encounter.

The figure above refines the data by correcting for the size of the group and by representing the semi-annual and annual data as a monthly proportion at the mid-month of the surveyed period. This results in a rate of infection by some virus per 1,000 PCs per month. It shows that the likelihood of a medium or large North American organization having an encounter with a virus grew from about one chance per 1,000 PCs per month in mid-1994 to approximately 28 chances per 1,000 PCs per month in February, 1997.

Certain viruses are more likely to be encountered than others, and certain viruses are "growing" in prevalence (i.e., more copies of them exist, infecting more PCs, files, and/or diskettes) while others are probably declining in population. We asked respondents which viruses affected their group during three periods of time (Jan.-Feb. 1997; Jul.-Dec. 1996; and Jan.-Jun. 1996), as well as how many times their group encountered each virus. Prevalence data for the ten most common viruses encountered in 1997 are shown as encounters per month per 1,000 PCs for each of the three survey periods in the table and figure below.

Table 14. Infections per Month per 1,000 Computers, Top Ten Viruses

Virus            Jan/Feb ’97 incidents   2nd half ’96 incidents   1st half ’96 incidents
WM Concept              7.37                    3.12                     2.63
Word Macro              1.64                    0.91                     0.25
Form                    2.09                    0.34                     0.27
Anti-EXE                0.62                    0.39                     0.28
WM Wazzu                1.81                    0.15                     0.03
Monkey B                0.70                    0.12                     0.11
NYB                     0.27                    0.15                     0.19
WM Npad                 0.41                    0.12                     0.05
Stealth B or C          0.28                    0.11                     0.09
Junkie                  0.05                    0.02                     0.24

Figure 3. Infections per Month per 1,000 Computers, Top Ten Viruses

Common Viruses Growing – Nearly all viruses shown in Figure 3 have become more prevalent over the approximately one year covered by this data (note: accounting for the survey biases might dampen this effect). Only Junkie, from these top ten, appears to be in decline. The greatest growth rates are of the top macro viruses, such as WM.Concept and WM.Wazzu.

Macro Viruses: Rapid Growth – Of the top ten viruses, seven are at least five years old; the exceptions (Natas, NYB, and WM.Concept) were approximately 3, 3, and 1.6 years old at the time of this survey. Until WM.Concept appeared, Natas and NYB were apparently "growing" the fastest. By far, the rate of growth of WM.Concept is the fastest of any virus ever observed to infect computers of the general public. There are several reasons for its apparent rapid growth:

We should note that a virus which appears to be in decline may actually be increasing in prevalence. If a user is infected with an older virus that is easily dispatched with the product on hand, that user is likely to kill the virus without reporting it to management. If the virus is contained because of the effectiveness of anti-virus products, it is not likely to be remembered, and thus not likely to be reported to our survey researchers. It is those viruses which cause unpleasant experiences, data loss, massive infection, and which prove difficult to remove that are most likely to be recorded.

Internal Error Checking – The data for Figure 3 were obtained from a different set of survey questions than the data used for the encounter information in Figure 2. A total of 31,281 encounters were reported for January-February, 1997 (questions 4A and 4B), and a total of 22,416 infections by specifically named viruses were reported for 1997 (question 5A). This leaves a difference of about 9,000 encounters, which may have been from viruses not recorded in the tables for question 5A. Alternatively, the difference may be due to errors in estimation.

Table 15. Reality Check on Encounter Rate.

Question                                                                          Total
4A. How many virus encounters did you have during February 1997?                 20,388
4B. How many virus encounters did you have during January 1997?                  10,893
5A. Which viruses have affected your group’s PCs during 1997? How many times?    22,416
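The consistency check described above reduces to simple arithmetic on the totals in Table 15:

```python
feb_encounters = 20_388      # question 4A
jan_encounters = 10_893      # question 4B
named_infections = 22_416    # question 5A

total_encounters = feb_encounters + jan_encounters
unexplained = total_encounters - named_infections
print(total_encounters)  # 31281
print(unexplained)       # 8865, the roughly 9,000 encounter gap noted above
```

A gap of this size (about 28% of reported encounters) is consistent with the survey's own caution that respondent-estimation errors may run as high as 50%.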

In the first two months of 1997, one virus – WM.Concept – accounted for about two thirds of all infections caused by the top ten viruses. In the first two months of 1996, the top three viruses (WM.Concept, Form, and Stealth Boot) accounted for about three quarters of all virus encounters. WM.Concept’s dominance last year is now being squeezed by other emerging macro viruses.

Figure 4. Relative Dominance of Top Ten Viruses, 1997

Figure 5. Dominance of Top Ten Viruses, 1996

Other Viruses Found – In addition to the viruses listed above, respondents reported a number of additional viruses, some of which are likely to appear in next year’s study. Many of these additional viruses are macro viruses. Table 16 below presents these viruses in alphabetical order.

Table 16. Other Viruses Reported, 1997.

Virus                        Number Reported
15 years/Espejo/Esto te       1
Aloha                         1
Anti-Alias                    1
Anti-OC                       1
Arachina                      1
B-1                           1
Barrotes                      1
BLU                           1
BOOT B                        1
Boot Virus                    8
Bupt/WelcomB                  1
Cascade                       1
D 1                           1
Da'Boys                       2
Dr. White                     1
Dragon                        1
Exabyte.3                     1
EXEC                          1
Form.A                        3
Frankenstein                  1
Fu_Manchu                     1
Int 10                        1
J&M                           1
Jerusalem                     6
Jerusalem.Mummy               1
Joshi                         2
Junkie                        1
Leandro                       1
Meat Grinder                  1
MICROSOFT                     1
Mirrox General 1              1
Music_Bug                     1
Natas                         1
NYB                           1
PacMan                        1
Read IOSys                    2
Sampo                         2
Stealth.B-1                   1
Stealth.Boot.H                1
Stoned                        1
Stoned.Angelina               1
Stoned.Bloomington/NoInt      1
Stoned.Empire.Monkey          3
Stoned.No_Int                 1
Tai-Pan.666/Doom2Death        1
Telecom.Boot                  1
Tentacle                      2
Trojan Horse                  2
Typen                         1
Urkel                         1
V-Sign                        2
WM.Alien                      1
WM.Bandung                    1
WM.CAP                        1
WM.Concept.A-F                5
WM.Divina                     1
WM.DMV                        1
WM.Imposter                   1
WM.Indonesia                  1
WM.Irish                      3
WM.Johny                      1
WM.Lunch.A                    1
WM.NOP                        1
WM.Npad                       2
WM.Nuclear                    1
WM.Rapi                       1
WM.ShowOff                    1
WM.Wazzu.A-F                  1
WM.Wazzu.C                    1
WordPerfect virus             1

Means of Infection – Respondents were asked to identify the means of infection for their most recent virus incident, or encounter if they did not have an incident. In this survey, respondents could indicate more than one avenue of infection, so totals for 1997 exceed 100%. A comparison with the 1996 survey is shown below.

Table 17. Sources of Infection, 1996-1997

Source                                                 1996    1997
A diskette, sales demo or similar                      11%     8.05%
A diskette, repair/service person                       3%     3.36%
A diskette, LAN manager/supervisor                      1%     2.68%
A diskette, shrink-wrapped software                     2%     4.36%
A diskette, malicious person intentionally planted      0%     1.01%
A diskette, brought from someone’s home                36%    42.28%
A diskette, other                                      21%    26.51%
On a distribution CD                                    0%     0.67%
A download from BBS, AOL, CompuServe, Internet         10%    16.11%
Other download (terminal emulation, client server)      2%     2.35%
Via e-mail as an attachment                             9%    26.17%
Via an automated software distribution                  0%     1.68%
While browsing on the World Wide Web                    --     5.37%
Other                                                   0%     5.03%
Don’t know                                             15%     7%

Figure 6. Infection Sources

It is not surprising that diskettes predominate as a vector for infection, since nine out of the top ten most prevalent viruses and 17 out of the top 20 were boot track viruses and could not travel by any other means. However, in the 1991 NCSA-Dataquest survey, the proportion of diskettes was even larger (87%), download sources were slightly lower, and e-mail attachment was not mentioned as a source or possible source.

Macro Virus Also Travels by E-mail and the Net – Though all viruses can, theoretically, travel by diskette, only executable file-type and macro viruses can travel by download or e-mail attachment. We looked at the top viruses to determine how they got to the organization. As may be seen in Table 18 and Figure 7, macro viruses are most likely to enter an organization via e-mail attachments, whereas boot viruses most often come via diskette. The home remains a common source of virus infection in offices.

Table 18. Sources of Infection, Boot and Macro Viruses, 1997.

Boot Macro
A diskette, sales demo or similar 2% 3%
A diskette, repair/service person 7% 1%
A diskette, LAN manager/supervisor 0% 1%
A diskette, shrink-wrapped software 2% 3%
A diskette, malicious person intentionally planted it 2% 0%
A diskette, brought from someone's home 26% 17%
A diskette, other 23% 12%
On a distribution CD 0% 0%
A download from BBS, AOL, CompuServe, Internet, etc. 9% 7%
Other download (terminal emulation, client server) 2% 2%
Via e-mail as an attachment 2% 36%
Via an automated software distribution 0% 0%
While browsing on the World Wide Web 5% 5%
Other 7% 2%
Don't Know 12% 9%

Figure 7. Sources of Infection, Boot and Macro Viruses, 1997.

Table 19 shows the analysis from the 1996 survey. In both surveys, e-mail was especially important as a transmission vehicle for macro viruses. The speed and international quality of e-mail will likely continue to contribute to the rapid spread of new and old macro viruses in coming years.

Table 19. Means of Infection Summary, 1996 Survey

E-Mail Download from BBS, AOL, C/S, Internet or Other
All Viruses Except Word.concept 7.7% 11.7%
All Viruses 8.8% 11.5%
Word.concept Encounters 21.5% 17.8%
Word.concept Incidents 30.5% 14.2%

Effects of Viruses – For the most recent encounter or incident (if they had one), respondents were asked to identify the effect the virus had on their group (see Table 20 and Figure 8, Effects of Viruses, 1996-1997). As in the 1991 and 1996 surveys, by far the greatest problems computer viruses cause are related to loss of productivity: the express loss of productivity, PCs unavailable to users, loss of access to data, unreliable applications, and system crashes. Many users also experienced corruption of PCs, applications, and data: screen messages, interference, or lockups; corrupted files; and lost data. It is noteworthy that, compared with the 1996 survey, the impact of viruses on productivity seems to be diminishing, whereas the impact on user confidence has increased.

Table 20. Effects of Viruses, 1996-1997

Effect 1996 1997
Loss of user confidence in the system 7% 26%
Threat of someone losing their job 3% 1%
Loss of productivity (machine, applications or data not available for some time) 81% 70%
Screen message, interference, or lockup 62% 54%
Lost data 39% 37%
Corrupted files 59% 57%
Loss of access to data (ie. on Server, Host, Mainframe, etc.) 49% 30%
Unreliable applications 35% 30%
PC was unavailable to the user 71% 59%
System Crash 30% 26%
Trouble saving files -- 54%
Trouble reading files -- 57%
Trouble printing -- 23%
Other (specify) 0% 3%
None 4% 12%
Don’t know 0% 1%

Figure 8. Effects of Viruses, 1996-1997

C. PC Virus Incidents

Overall Level of Virus Incidents -- Overall, 298 of the 300 respondents have had a virus infection in their organization at some time. In 1996, 30% of respondents had a virus incident in the preceding 14 months, nearly half (45%) of which occurred in January or February, 1996. In 1997, 81% reported an incident in the previous month!

Most Common Viruses – The 1997 survey asked three questions concerning which viruses affected the group:

5a. Which viruses have affected your group’s PCs during 1997? How many times?

5b. Which viruses affected your group’s PCs during the second half of 1996 (July-December)? How many times?

5c. Which viruses affected your group’s PCs during the first half of 1996 (January-June)? How many times?

We have tabulated answers in two forms: the percentage of respondents having an incident with the virus, and the total number of infected machines (the sum of the "how many times" answers across respondents).

Table 21. Most Commonly Found: Percent of Organizations Infected

Virus Jan/Feb ’97 2nd half of ’96 1st half of ’96
Anti-CMOS 10% 12% 8%
Anti-EXE 18% 16% 12%
Form 16% 18% 16%
Green Caterpillar <1% 0% 0%
Jumper 0% <1% 0%
Junkie 2% 2% 1%
Michelangelo 3% 2% 3%
Monkey B 17% 15% 12%
NATAS <1% 1% 1%
NYB 10% 7% 5%
One Half <1% <1% 0%
Parity Boot 1% 1% <1%
Ripper 3% 4% 3%
Stealth B or C 14% 13% 10%
Stoned (Monkey Empire) 13% 14% 15%
WelcomB 2% 2% 1%
Word Macro 15% 12% 7%
WM Concept 34% 31% 19%
WM Npad 4% 3% 1%
WM Wazzu 19% 10% 5%
WM MDMA 3% 2% <1%
WM Colors 6% <1% 0%
Excel Macro 1% 1% <1%
XM Laroux 1% <1% <1%
XM Sofa 0% 0% 0%
Other (specify) 25% 19% 13%
None 5% 2% 3%
Don’t know 9% 22% 39%
Refused <1% <1% <1%

The table below presents the "unweighted total" number of machines infected throughout the survey. Thus if only two respondents reported this virus at all, each reporting 100 infected machines in the time period, the number "200" would be presented in the table. As such, this table doesn’t show what percentage of organizations were infected with the virus, or what percentage of machines within the average organization were infected. But it does provide a sensitive measure of the success of a virus in infecting machines.
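
The tabulation described above can be sketched in a few lines of Python. The input format, and the single virus name used, are hypothetical, chosen only to mirror the two-respondent example in the text:

```python
from collections import defaultdict

def unweighted_totals(responses):
    """Sum reported infection counts per virus across all respondents.

    `responses` is a list of dicts mapping virus name -> machines infected,
    one dict per respondent (a hypothetical input format).
    """
    totals = defaultdict(int)
    for report in responses:
        for virus, count in report.items():
            totals[virus] += count
    return dict(totals)

# Two respondents each reporting 100 infected machines yields 200,
# exactly as described in the text.
reports = [{"WM Concept": 100}, {"WM Concept": 100}]
print(unweighted_totals(reports))  # {'WM Concept': 200}
```

Note that this measure deliberately does not normalize by the number of respondents or by organization size; it is a raw count of infected machines.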

Several observations on this table:

Table 22. Viruses Most Commonly Found: Total Infections

Virus Jan/Feb ’97 2nd half ’96 1st half ’96 Total
WM Concept 10750 13662 11481 35893
Word Macro 2392 3990 1104 7486
Form 3048 1478 1187 5713
Anti-EXE 906 1721 1240 3867
WM Wazzu 2632 659 135 3426
Monkey B 1021 512 497 2030
NYB 390 667 839 1896
WM Npad 602 540 202 1344
Stealth B or C 414 481 377 1272
Junkie 67 108 1040 1215
Stoned (Monkey Empire) 167 516 353 1036
Anti-CMOS 230 422 290 942
Excel Macro 100 70 50 220
WM MDMA 196 6 0 202
Michelangelo 17 45 128 190
Ripper 32 66 58 156
WelcomB 20 101 35 156
NATAS 0 79 23 102
One Half 11 10 0 21
WM Colors 7 5 0 12
XM Laroux 10 2 0 12
Jumper 0 10 0 10
Parity Boot 4 2 0 6
Green Caterpillar 1 0 0 1
XM Sofa 0 0 0 0

Figure 9. Relative Prevalence of Top Ten Viruses

PCs Affected by Incident – One of the most costly effects of a virus incident is the disruption caused by the investigation process required to determine the severity of the virus encounter or incident and isolate which PCs were affected. An entire network may be shut down only to find the virus was isolated to one or two PCs in the group. Respondents were asked how many PCs were originally suspected of having a virus and had to be examined. As a follow-up, they were asked how many PCs were actually infected when all analysis was complete. The same questions were posed for servers suspected and actually infected by the most recent virus incident.

The average incident infected 107 PCs and 1.8 servers, a reduction from last year’s figures. It would appear that organizations are improving at early detection. Last year’s fears proved modest compared with the reality of server infections; this year, the suspicions exceeded the reality.

Table 23. PCs and Servers Suspected/Actually Infected During Most Recent Incident, 1996-1997

’96 Suspected ’96 Actual ’97 Suspected ’97 Actual
PC 131 135 94.6 107
Server 1.6 5.4 7.64 1.81

Cost of an Incident -- Respondents were asked to estimate the cost of the incident to their group. Data were gathered in several ways. The length of time it took the group to completely recover averaged 46.6 hours (range 0 to 336 hours), and recovery consumed an average of 10 person-days (range 0 to 150 person-days). Ranges for these measures are shown in Figure 10 and Figure 11.

Figure 10. Length of Time to Completely Recover from Most Recent Incident, 1996-1997

Figure 11. Person-Days Most Recent Incident Cost the Group, 1996-1997

Despite the rather substantial impact on server availability, user down-time, and IS support time caused by the most recent computer virus incident, the average estimate of the actual cost of this problem was just $8,366, about the same as last year’s finding of $8,106. This probably underestimates the true cost of an incident: most respondents were at the analyst or manager level in their organization, and would not customarily consider all of the related costs of lost productivity, lost business, and other costs of down-time.

The range of reported costs is shown in Figure 12. Respondents reported widely varying costs, from a low of zero dollars, to one site which reported over $110,000 for a single computer virus incident.

Figure 12. Stated Cost of Virus Incident, 1996-1997.

D. Usage of Anti-Virus Products

Overall Level of Usage -- Virtually all respondents had one or more different anti-virus products available to them. It should be noted that a probable bias exists toward increased use of anti-virus products among survey respondents compared with non-respondents (see site selection bias in Methods section). But there may be another bias – to look good to the interviewer, and inflate the number of protected machines. Both biases would suggest that the installed base that is reported might be higher than the actual installed base throughout all organizations.

To examine this, we asked two more questions:

Anti-Virus Methods Employed – Respondents were asked to estimate the number of PCs protected by each of several methods; respondents could choose more than one answer. Results are shown below, both as the percentage of respondents using a method and as the number of machines protected by that method. Adding up the number of PCs protected by the various methods gives 1,430,256 protected machines; with only 728,798 machines represented in the study, we can conclude that each machine is protected by about two methods on average.
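
The averaging arithmetic above can be checked directly. The per-method counts are taken from Table 24; the shortened method labels are ours:

```python
# Reproducing the report's arithmetic: if the per-method protected-PC
# counts sum to well over the number of machines in the study, each
# machine must be covered by more than one method on average.
protected_by_method = {
    "user checks diskettes/downloads": 320_268,
    "scan at boot-up": 402_598,
    "scan at login": 194_526,
    "full-time background scan": 289_740,
    "other periodic detection": 132_770,
    "other full-time detection": 58_881,
    "other": 31_473,
}
total_protected = sum(protected_by_method.values())  # 1,430,256
machines_in_study = 728_798

avg_methods_per_machine = total_protected / machines_in_study
print(round(avg_methods_per_machine, 2))  # 1.96, i.e. about two methods
```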

Table 24. Desktop Virus Protection Methods Used

Protection % Respondents # of PCs
Users check diskettes and downloads for viruses 64% 320,268
Anti-virus software scans hard drive for viruses every boot-up 68% 402,598
Anti-virus software scans hard drive for viruses every login 39% 194,526
Anti-virus software scans hard drive for viruses full time in the background 60% 289,740
Other periodic anti-virus detection on the desktop 41% 132,770
Other full-time anti-virus detection on the desktop 20% 58,881
Other (specify) 5% 31,473
None 1% --
Don’t know <1% --

A closer look at desktop protection methods finds that only 16% of respondents used only one of the above methods of protection, 19% used two, and 32% used three. The distribution of respondents on this question, showing the number of methods used, is provided in the table below.

Table 25. Number of Desktop Protection Methods Used

1 2 3 4 5 6
16% 19% 32% 19% 11% 3%

Estimating the Effectiveness of Desktop Protection Approaches – We were interested in whether there is a relationship between the method of protection used and the infection rate. To estimate infection rate, we divided the number of reported virus encounters in February 1997 by the number of machines the respondent was responsible for. We then correlated this rate with the percentage of machines using a given method. [Table 26]
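
A minimal sketch of this correlation procedure, assuming hypothetical respondent data (the survey's per-respondent figures are not published here):

```python
import math

def pearson(xs, ys):
    """Plain Pearson correlation coefficient, no external libraries."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical respondents: infection rate = encounters / machines,
# correlated against the fraction of machines using a given method.
encounters = [3, 1, 4, 0, 2]
machines = [250, 400, 300, 500, 350]
pct_using_method = [0.2, 0.8, 0.1, 0.9, 0.5]

rates = [e / m for e, m in zip(encounters, machines)]
r = pearson(rates, pct_using_method)
print(round(r, 2))
```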

No correlation is large enough to be statistically significant with this sample size. These small correlations may arise because, while an effective method may result in fewer disasters, it may not prevent the introduction of viruses to a machine; indeed, it facilitates detection of a virus when one is introduced, thus increasing reported incidents.

Table 26. Effectiveness in Virus Prevention, Desktop Encounters 1997

Protection Correlation
Users check diskettes and downloads for viruses -.03
Anti-virus software scans hard drive for viruses every boot-up -.03
Anti-virus software scans hard drive for viruses every login +.05
Anti-virus software scans hard drive for viruses full time in the background +.02
Other periodic anti-virus detection on the desktop -.10
Other full-time anti-virus detection on the desktop -.07

To examine this, we looked at virus disasters. Our measure of disaster was months since the most recent disaster. Respondents not reporting a disaster were eliminated. Results are shown in Table 27. A positive correlation means that the greater the percentage of desktop machines protected with this method, the longer the interval since the most recent disaster. As can be seen, most correlations are small but negative, suggesting that none of the methods is effective in preventing disaster. We have trouble believing this interpretation. More likely, a disaster triggers the use of one or more methods, producing the negative correlations.

Table 27. Effectiveness in Averting Disasters

Protection Correlation
Users check diskettes and downloads for viruses -.12
Anti-virus software scans hard drive for viruses every boot-up -.06
Anti-virus software scans hard drive for viruses every login -.09
Anti-virus software scans hard drive for viruses full time in the background -.15
Other periodic anti-virus detection on the desktop -.01
Other full-time anti-virus detection on the desktop +.03

Server Protection Methods -- Similarly, respondents were asked how many file and application servers used each of two basic methods: periodic (anti-virus software scans the server hard drive periodically) or full-time (anti-virus software scans the hard drive for viruses full time in the background). Results are shown below.

Table 28. Server Virus Protection Methods Used

Protection % Respondents # of Servers
Anti-virus software scans hard drive for viruses periodically 56% 9,389
Anti-virus software scans hard drive for viruses full time in the background 54% 9,766
Other (specify) 7% 313
None 11% --
Don’t know 3% --
Refused <1% --

Table 29. Percent of Servers Running Either Periodic or Full-Time Scans, or Both

Periodic Only Full-time Only Both
40% 32% 28%

E-Mail Gateways – With the advent of macro viruses, careful monitoring of e-mail attachments has become more critical than ever. In the past, any infected file or boot virus dropper could be sent as an e-mail attachment. Double clicking on it in Windows 95 might invoke the program, or invoke an extraction utility such as WinZip. Once executed, the file virus would be able to gain control of the machine. (The boot virus dropper would fail under Windows 95, however, which blocks writes to the boot area while it is running.) But documents are attached to e-mail far more often than program files, and Word documents are now home to Word macro viruses. While users can still extract documents and manually scan them for macro viruses, e-mail gateways which monitor attachments are becoming a good idea. Of course, they won’t be able to see a virus in an attachment that is zipped and password-protected, or that is in an attachment that uses a "non-standard" compression approach. Nonetheless, this approach is gaining acceptance, and we asked the question, "What percentage of e-mail gateways have full-time anti-virus software installed now?" Five out of six respondents answered this question, with a mean of 29.2% gateways protected.

Proxy Servers and Firewalls – Separating the inside of the organization from the outside world is the job of proxy servers and firewalls. Because viruses can pass through network connections, virus filtration is increasingly added to these protection tools. The survey asked the percentage of these devices with full-time virus screening. Results are shown in the table below.

Table 30. E-mail, proxy servers, and firewalls with virus protection

Protected Device Percent Protected
e-mail gateways protected 29.2
proxy servers protected 24.5
firewalls protected 29.4

Since viruses generally come to a site unexpectedly from the outside, it is expected that sites with good protection will have about the same number of virus incidents or encounters as those with poor protection. However, sites with good protection should be successful at preventing a virus encounter from becoming an incident. That is, good protection should limit the number of PCs, files, or diskettes infected by a virus after it encounters the site.

New macro viruses apparently caused incidents even in sites with good, full time protection. This is likely due to the fact that such viruses are new, anti-virus vendors took some time building full-time protection into their products, and respondent sites took further time rolling the new versions of protective software out to their end users. During this time, these sites were not using full-time protection against this new virus and were vulnerable.

It is clear that increased full-time protection, especially at the desktop, is needed. Smaller organizations, home PCs, and small-business PCs are probably less likely to have good virus protection strategies than the organizations in this survey, which averaged over 2,400 PCs and most of which had a "computer virus expert" on staff. Getting full-time protection of both desktop PCs and servers above 60% for all classes of users is an appropriate goal, one that would severely cripple the survival of computer viruses in the world-wide computing environment.

Appendix A: Questionnaire

NCSA Virus Survey

3/6/96

1. Survey calling dialogue:

Hello, may I speak with <NAME2 > (<FNAME > <LNAME >) or the person most responsible for computer virus management in your organization?

Hello, this is _____ with Paria Group. I am calling on behalf of the National Computer Security Association (NCSA) to gather some confidential information about computer viruses. The information from this study will be published to better educate the business community. I understand that you are responsible for computer virus management at your site. May I take a few minutes to ask you some questions?

Hello, may I speak with NAME OF CONTACT? Hello, this is _____ from Paria Group calling on behalf of the National Computer Security Association (NCSA) to gather some confidential information about computer viruses. The information from this study will be published to better educate the business community. I understand that you are responsible...

Surveyor:___________________________

Date:_______________________________

NCSA Virus Survey

3/1/97

S2. How many personal computers in North America are you responsible for in terms of virus knowledge, prevention, and software? __________Dk Ref (If less than 200, terminate)

S2a. Which desktop PC operating systems does your organization have running?

How many desktop PCs run each system?

OS

# of PCs

DOS only, no Windows 1
Windows 3.1 2
Windows 95 3
Windows NT 4
OS/2 5
Macintosh 6
Unix 7
Other (specify) 8
None 9
Don’t know 10
Refused 11

S3. How many file and application servers are you responsible for in terms of virus knowledge, prevention and software? __________Dk Ref

S3a. Which LAN server operating systems does your organization have running?

How many servers run each system?

NOS

# of servers

Novell NetWare 3.x 1
Novell NetWare 4.x 2
Windows NT 3
IBM OS/2 LAN Server 4
IBM LAN Server 5
Banyan Vines 6
Unix 7
Other (specify) 8
None 9
Don’t know 10
Refused 11

1. Does your organization have a formal virus tracking procedure?

Yes 1
No 2
Don’t know 3
Refused 4

2. What percent of virus incidents in your group are you informed of or likely to know about? __________Dk Ref

3. To the best of your knowledge, has a computer virus ever been discovered in any PC, diskette or file in your organization?

Yes 1
No (skip to Q9) 2
Don’t know 3
Refused 4

For the remainder of the survey, a "virus encounter" will be defined as an event or incident where viruses were discovered on any PCs, diskettes, or files.

4a-e. How many virus encounters did you have during:

February 1997 __________Dk Ref

January 1997 __________Dk Ref

Second half of 1996 (July-December) __________Dk Ref

First half of 1996 (January-June) __________Dk Ref

All of 1995 __________Dk Ref

5a. Which viruses have affected your group’s PCs during 1997? How many times? (do not read the list)

5b. Which viruses affected your group’s PCs during the second half of 1996 (July-December)? How many times?

5c. Which viruses affected your group’s PCs during the first half of 1996 (January-June)? How many times?

Virus

‘97

incidents

2nd half of ‘96

incidents

1st half of ‘96

incidents

Anti-CMOS 1 1 1
Anti-EXE 2 2 2
Form 3 3 3
Green Caterpillar 4 4 4
Jumper 5 5 5
Junkie 6 6 6
Michelangelo 7 7 7
Monkey B 8 8 8
NATAS 9 9 9
NYB 10 10 10
One Half 11 11 11
Parity Boot 12 12 12
Ripper 13 13 13
Stealth B or C 14 14 14
Stoned (Monkey Empire) 15 15 15
WelcomB 16 16 16
Word Macro 17 17 17
WM Concept 17a 17a 17a
WM Npad 17b 17b 17b
WM Wazzu 17c 17c 17c
WM MDMA 17d 17d 17d
WM Colors 17e 17e 17e
Excel Macro 18 18 18
XM Laroux 18a 18a 18a
XM Sofa 18b 18b 18b
Other (specify) 19 19 19
None 20 20 20
Don’t know 21 21 21
Refused 22 22 22

For the remainder of the survey, the word "group" refers to those PCs and servers for which you are responsible.

5d. Compared to this time last year, do you feel virus problems in the computing industry are: (read list)

Much worse 1
Somewhat worse 2
About the same 3
Somewhat better 4
Much better 5
Don’t know 6
Refused 7

5e. Compared to this time last year, do you feel Word Macro Virus problems in the computing industry are: (read list)

Much worse 1
Somewhat worse 2
About the same 3
Somewhat better 4
Much better 5
Don’t know 6
Refused 7

5f. Compared to this time last year, do you feel the Excel Macro Virus problems in the computing industry are: (read list)

Much worse 1
Somewhat worse 2
About the same 3
Somewhat better 4
Much better 5
Don’t know 6
Refused 7

For the remainder of the survey, a "virus disaster" will be defined as a virus encounter where a minimum of 25 PCs, diskettes, or files were infected by the same virus at relatively the same time.

6. Has your group had a virus disaster anytime since January 1996?

Yes 1
No (skip to Q7) 2
Don’t know 3
Refused 4

6a. When was the month and year of your most recent disaster?

January 1997 1
February 1997 2
March 1997 3
January 1996 4
February 1996 5
March 1996 6
April 1996 7
May 1996 8
June 1996 9
July 1996 10
August 1996 11
September 1996 12
October 1996 13
November 1996 14
December 1996 15
Don’t know 16
Refused 17

6b. What was the name of the virus in your most recent disaster? (do not read list)

Anti-CMOS 1
Anti-EXE 2
Form 3
Green Caterpillar 4
Jumper 5
Junkie 6
Michelangelo 7
Monkey B 8
NATAS 9
NYB 10
One Half 11
Parity Boot 12
Ripper 13
Stealth B or C 14
Stoned (Monkey Empire) 15
WelcomB 16
Word Macro 17
WM Concept 17a
WM Npad 17b
WM Wazzu 17c
WM MDMA 17d
WM Colors 17e
Excel Macro 18
Laroux 18a
Sofa 18b
Other (specify) 19
None 20
Don’t know 21
Refused 22

6c. How many PCs were initially suspected of having the _____________virus? __________Dk Ref

6d. How many PCs were actually found to be infected? __________Dk Ref

6e. How many SERVERS were initially suspected of having the virus? __________Dk Ref

6f. How many SERVERS were actually found to be infected? __________Dk Ref

6g. How long were any servers "down" while dealing with the disaster? (server hours) __________Dk Ref

6h. How long did it take for your group to completely recover? (hours) __________Dk Ref

6i. How many person days did the disaster cost your group? __________Dk Ref

6j. How many dollars did the disaster cost your group? __________Dk Ref

7. Which of the following effects occurred in your group with the most recent virus disaster or encounter? (Read the list) (Check all that apply)

Loss of user confidence in the system 1
Threat of someone losing their job 2
Loss of productivity (machine, applications or data not available for some time) 3
Screen message, interference, or lockup 4
Lost data 5
Corrupted files 6
Loss of access to data (ie. on Server, Host, Mainframe, etc.) 7
Unreliable applications 8
PC was unavailable to the user 9
System Crash 10
Trouble saving files 11
Trouble reading files 12
Trouble printing 13
Other (specify) 14
None 15
Don’t know 16
Refused 17

8. How did your most recent virus disaster or encounter come to your site? (Check all that apply)

A diskette, sales demo or similar 1
A diskette, repair/ service person 2
A diskette, LAN manager / supervisor 3
A diskette, shrink-wrapped software 4
A diskette, malicious person intentionally planted it 5
A diskette, brought from someone’s home 6
A diskette, other 7
On a distribution CD 8
A download from BBS, AOL, CompuServe, Internet, etc. 9
Other download (terminal emulation, client server) 10
Via e-mail as an attachment 11
Via an automated software distribution 12
While browsing the World Wide Web 13
Other (specify) 14
None 15
Don’t know 16
Refused 17

9a Which anti-virus products are you running at the desktop PC level? How many desktop PCs are running each product?

9b. Which anti-virus products are you running at the server level? How many servers are running each product?

Product

# of DT PCs

# of servers

Avast, Avast 32 1 1
Cheyenne, Inoculan 2 2
Command Software, FPROT 3 3
Dr. Solomon’s A.V. Toolkit 4 4
Eliashim, ViruSafe 5 5
ESaSS BV, Thunder Byte AV 6 6
IBM, IBM Antivirus 7 7
INTEL LanDesk Protect 8 8
Iris Software, IRIS AV 9 9
McAfee, ViruScan 10 10
Norman Data, Norman Virus Control 11 11
ON Technology 12 12
Quantum Leap, Anti Virus 13 13
Symantec, CPAV Central Point AV 14 14
Symantec, NAV Norton Antivirus 15 15
Trend Micro Inc., PC-Cillin 16 16
Other (specify) 17 17
None 18 18
Don’t know 19 19
Refused 20 20

10a) On the desktop PC level, which of the following anti-virus software protection methods are used? How many PCs use each method? (Read the list)

Protection

# of PCs

Users check diskettes and downloads for viruses. 1
Anti-virus software scans hard drive for viruses every boot-up 2
Anti-virus software scans hard drive for viruses every login 3
Anti-virus software scans hard drive for viruses full time in the background 4
Other periodic anti-virus detection on the desktop 5
Other full-time anti-virus detection on the desktop 6
Other (specify) 7
None 8
Don’t know 9
Refused 10

10b) On the file server level, which of the following anti-virus software protection methods are used? How many servers use each method? (Read the list)

Protection

# of Servers

Anti-virus software scans hard drive for viruses periodically 1
Anti-virus software scans hard drive for viruses full time in the background 2
Other (specify) 3
None 4
Don’t know 5
Refused 6

10c. What percentage of e-mail gateways have full-time anti-virus software installed now? _________ Dk Ref

10d. What percentage of proxy servers have full time anti-virus software installed now? ___________ Dk Ref

10e. What percentage of firewalls have anti-virus software installed now? ____________ Dk Ref

10f. What percentage of desktop PC’s have NO anti-virus software installed? _____________ Dk Ref

10g. What percentage of desktop PC’s have anti-virus software installed, but not running? ____________ Dk Ref

11. What department are you in?

Accounting / Finance 1
Customer Service/support 2
Data Processing 3
Education / Training 4
Engineering 5
General Administration / Management 6
Manufacturing / Production 7
Research & Development 8
MIS/IS (Management Information Systems) 9
Personnel / Human Resources 10
Public relations / Communications 11
Purchasing 12
Sales / Marketing 13
Other (specify) 14
None 15
Don’t know 16
Refused 17

12. What is your job title?

Director of Data Processing / Information Systems 1
Director of Computer Security 2
Manager of Data Processing 3
Manager of MIS/IS 4
Manager of Computer Security 5
Systems Analyst/Programmer 6
Support Specialist 7
Systems Manager/ PC Network /LAN Manager 8
Other (specify) 9
None 10
Don’t know 11
Refused 12

13. What is your organization’s primary line of business?

Accounting 1
Agriculture 2
Business Services 3
Communications 4
Construction 5
Education 6
Engineering 7
Finance 8
Government 9
Healthcare 10
Insurance 11
Legal 12
Manufacturing 13
Mining 14
Non-Profit 15
Professional Services 16
Real estate 17
Retail 18
Transportation 19
Utilities 20
Wholesale 21
Other (specify) 22
Don’t know 23
Refused 24

F1) Would you like to contribute a case history of a virus disaster (NCSA would interview you further)?

Yes 1
No 2
Don’t know 3
Refused 4

F2) Conceptually, would you be willing to participate in a brief, monthly or quarterly report in which you report all virus encounters during that period to NCSA? ______________ Dk Ref

F3) Would you also, conceptually, be willing to provide sample of actual virus encounters? ______________ Dk Ref

Contact Name:_____________________________________________________

Contact Phone:_____________________________________________________

Appendix B: Summary of Survey Execution Statistics

Total hours 187.5

Total calls 1398

Refusals 264

Completed 300

Calls / complete 4.6

Complete / hour 1.6