Global Database of Disaster Responses
We coordinated a collaborative five-year project with researchers in the business school and the department of computer and information science of a university in the United States to build what is arguably the largest database of disaster aid at the international level. The dataset covers monetary and in-kind donations from firms, governments, multinational agencies, and non-governmental organizations, as reported in news media, toward relief and recovery from all major disasters that affected the world from 1990 to 2019. The coded data on corporate aid comprise 96,858 donations from 40,170 firms headquartered in 84 countries to 4,706 natural disasters that hit 208 countries in the period 2003-2019.
Collecting Data. We used the following procedure to track disaster donations:
- We obtained data on epidemic outbreaks, natural disasters, terrorist attacks, and technological accidents from a variety of sources. First, we used the International Disaster Database (EM-DAT) from the Centre for Research on the Epidemiology of Disasters, which registers disasters that meet at least one of the following criteria: 10 or more people killed, 100 or more people affected, a declaration of a state of emergency, or a call for international aid (further information is available at http://www.emdat.be/). Second, to address inaccuracies and missing data in EM-DAT, we obtained data from the reinsurance company Swiss Re and the Financial Tracking System (FTS) of the United Nations Office for the Coordination of Humanitarian Affairs (UNOCHA).
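This inclusion rule can be expressed as a simple check. The snippet below is a minimal sketch assuming hypothetical field names (`total_deaths`, `total_affected`, `state_of_emergency`, `call_for_international_aid`); it is not the actual EM-DAT schema.

```python
# Minimal sketch of the EM-DAT inclusion rule with hypothetical field names;
# an event is retained if it satisfies at least one of the four criteria.
def meets_emdat_criteria(event: dict) -> bool:
    """Return True if the event meets any EM-DAT inclusion criterion."""
    return (
        event.get("total_deaths", 0) >= 10
        or event.get("total_affected", 0) >= 100
        or event.get("state_of_emergency", False)
        or event.get("call_for_international_aid", False)
    )

# A flood with 40 deaths is retained even without an emergency declaration.
print(meets_emdat_criteria({"total_deaths": 40}))  # True
```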
- We wrote automated Python code to identify disaster donations in news reports using Factiva, Google, and Lexis Nexis. The search range was within one year of the official start date of the disaster. A story was relevant for our database if its headline or body was returned by a Boolean search combining the affected country, the type of disaster, and, in some cases, the name of the disaster (a query-construction sketch follows this list). Specifically, the Boolean combinations are as follows:
- The affected country.
- Event. Derivations of:
- Epidemic: “pandemic” OR “epidemic”
- Mass movement: “landslide” OR “avalanche” OR “rockfall” OR “subsidence”
- Earthquake: “seismic” OR “quake” OR “earthquake” OR “tsunami”
- Flood: “flood”
- Storm: “storm” OR “typhoon” OR “cyclone” OR “hurricane” OR “tornado”
- Volcano: “volcano” OR “volcanic” OR “eruption”
- Technological accident: “accident” OR “explosion”
- Terrorism: “terrorist” OR “attack”
- Action. Derivations of: “donation” OR “donate” OR “donated” OR “donating” OR “pledge” OR “pledged” OR “pledging” OR “give” OR “gave” OR “given” OR “giving.”
- Disaster name, when available.
An example of a Boolean combination is: [03/11/2011-03/11/2012]; (“Japan” or “Japanese” or “Japan’s” or “Japans”) and (“tsunami” or “earthquake” or “quake” or “disaster”) and (“donation” or “donate” or “pledge” or “pledging” or “give” or “gave” or “given” or “giving”).
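To illustrate, the sketch below assembles such a query from term lists like those above. The function name `build_query`, the dictionaries, and the inclusion of only two event types are illustrative assumptions, not the production search code.

```python
from datetime import date, timedelta

# Illustrative term lists mirroring the Boolean combinations above; only two
# event types are shown to keep the sketch short.
EVENT_TERMS = {
    "earthquake": ["seismic", "quake", "earthquake", "tsunami"],
    "storm": ["storm", "typhoon", "cyclone", "hurricane", "tornado"],
}
ACTION_TERMS = ["donation", "donate", "donated", "donating", "pledge",
                "pledged", "pledging", "give", "gave", "given", "giving"]

def build_query(country_terms, event_type, start, disaster_name=None):
    """Compose the one-year date window and Boolean query string for one disaster."""
    window = (start, start + timedelta(days=365))
    clauses = [
        " OR ".join(f'"{t}"' for t in country_terms),
        " OR ".join(f'"{t}"' for t in EVENT_TERMS[event_type]),
        " OR ".join(f'"{t}"' for t in ACTION_TERMS),
    ]
    if disaster_name:                      # disaster name, when available
        clauses.append(f'"{disaster_name}"')
    return window, " AND ".join(f"({c})" for c in clauses)

window, query = build_query(["Japan", "Japanese", "Japan's", "Japans"],
                            "earthquake", date(2011, 3, 11))
print(window)
print(query)
```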
- To make over 2,310,000 electronic reports computationally tractable, we applied differential language analysis using JavaScript Object Notation (i.e., JSON and AJAX) to parse the data. We coded the following fields per article (a coding sketch follows the field list):
- Actor: Entity making the donation.
- Actual donation.
- For in-kind donations, the characteristics of the product or service were recorded (e.g., 1,000 bottles of water; a team of nine technicians) and monetized using either current prices in the affected country (e.g., the average price of one liter of bottled water, the daily wage of a specific professional or technician) or an equivalent pecuniary value based on other firms’ reporting of their donations to the same disaster.
- Donations reported in a currency other than the U.S. dollar were converted using the exchange rate on the day of the donation (a minimal conversion sketch follows these items).
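The monetization and conversion step can be sketched as follows. The lookup tables `USD_RATE` and `LOCAL_UNIT_PRICE_USD`, and the values they contain, are purely illustrative assumptions; the real figures came from the exchange rate on the donation day and prices current in the affected country (or peer firms’ reported values for the same disaster).

```python
from datetime import date

# Illustrative lookup tables for the conversion and monetization sketch.
USD_RATE = {("JPY", date(2011, 3, 14)): 0.0122}
LOCAL_UNIT_PRICE_USD = {("Japan", "bottled_water_liter"): 1.10}

def to_usd(amount, currency, on):
    """Convert a cash donation to USD at the exchange rate on the donation day."""
    return amount if currency == "USD" else amount * USD_RATE[(currency, on)]

def monetize_in_kind(country, item, quantity):
    """Value an in-kind donation at current local prices in the affected country."""
    return quantity * LOCAL_UNIT_PRICE_USD[(country, item)]

print(to_usd(100_000_000, "JPY", date(2011, 3, 14)))            # cash gift in yen
print(monetize_in_kind("Japan", "bottled_water_liter", 1_000))  # 1,000 liters of water
```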
- Donations Toward Market Factors.
- The donation reports were coded for whether the target was associated with factors that underpin the functioning of the market. For instance, if the donation report stated that the company donated toward “power”, “generators”, “communication”, “airport”, “transport”, “roadways”, “emergency housing”, “rebuilding”, “restoring”, “reconstruction”, “schools”, etc., the donation was coded one, and zero otherwise.
- Employee-driven donation. When the news article mentioned that the donation was an initiative of the employees (and, for example, the company is matching whatever the employees collected), a binary variable took value 1.
- Direct Impact: When the news article mentioned that the disaster affected the organization physically in any way (e.g., corporate assets such as buildings were damaged) and/or employees were injured, a binary variable took value 1.
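The binary coding of the three fields above can be sketched as simple keyword matching over the article text. The matching rules below (including the employee-matching heuristic and the direct-impact word list) are simplified assumptions for illustration, not the exact coding protocol.

```python
# Simplified keyword-based coding of three binary fields.
MARKET_FACTOR_TERMS = ["power", "generators", "communication", "airport",
                       "transport", "roadways", "emergency housing",
                       "rebuilding", "restoring", "reconstruction", "schools"]

def code_report(text):
    """Return 0/1 flags for market factors, employee-driven, and direct impact."""
    t = text.lower()
    return {
        "market_factors": int(any(term in t for term in MARKET_FACTOR_TERMS)),
        "employee_driven": int("employee" in t and ("match" in t or "collected" in t)),
        "direct_impact": int(any(w in t for w in ["damaged", "destroyed", "injured"])),
    }

print(code_report("The firm will match what employees collected to help "
                  "rebuild schools damaged by the quake."))
# {'market_factors': 1, 'employee_driven': 1, 'direct_impact': 1}
```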
- To increase the relevance of the output (for example, some news reports bundled a series of otherwise irrelevant articles whose combined text would have satisfied the search), the search was qualified with the following filtering process (a proximity-check sketch follows this list):
- The name of the country had to be within 50 words of the type of the disaster or the word “disaster.”
- Entities and the act of donating were parsed:
- The entities per article were extracted and grouped in three categories: organization (e.g., Tepco), location (e.g., Canada), and individual (e.g., Barack Obama).
- The verb identifying the act of donating had to be within 30 words of an entity.
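Both proximity filters amount to a token-distance check, as in the sketch below; the whitespace tokenization and exact-match comparison are assumptions made only for illustration.

```python
# Token-distance checks for the two proximity filters.
def within_n_words(tokens, term_a, term_b, n):
    """True if some occurrence of term_a lies within n tokens of term_b."""
    pos_a = [i for i, w in enumerate(tokens) if w == term_a]
    pos_b = [i for i, w in enumerate(tokens) if w == term_b]
    return any(abs(i - j) <= n for i in pos_a for j in pos_b)

tokens = "toyota pledged one million dollars after the earthquake in japan".split()
print(within_n_words(tokens, "japan", "earthquake", 50))  # country near disaster term
print(within_n_words(tokens, "pledged", "toyota", 30))    # donation verb near entity
```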
Assessing the Quality of Data. We used the following procedure to check the accuracy of our collection:
- We hired independent researchers to conduct two different procedures to verify the quality of the dataset against third-party sources such as company sustainability reports. We randomly selected five percent of the events (156) for the period 2003-2013, and the researchers searched for reports using Google, Lexis Nexis, and Factiva. In this procedure, 5.1 percent of the selected events (8) contained data inaccuracies. About 60 percent of these errors were associated with monetizing the value of in-kind donations, with less than 8 percent of the donations incorrectly marked. The remaining discrepancies were due to missing data on the nature of the donor’s business.
- We ran another random draw excluding previously evaluated cases, and the researchers repeated the analysis (a sampling sketch follows). No other discrepancies were found.
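A minimal sketch of the two verification draws is shown below; the total event count and the random seeds are illustrative assumptions.

```python
import random

# Two non-overlapping verification draws: a 5 percent sample of events, then a
# second draw that excludes the already-checked cases.
def draw_sample(event_ids, share=0.05, exclude=frozenset(), seed=0):
    """Sample a share of events at random, skipping previously evaluated ones."""
    pool = [e for e in event_ids if e not in exclude]
    k = max(1, round(len(pool) * share))
    return random.Random(seed).sample(pool, k)

events = list(range(3_120))                               # illustrative event IDs
first = draw_sample(events)                               # first 5% draw (156 events)
second = draw_sample(events, exclude=set(first), seed=1)  # second, non-overlapping draw
print(len(first), len(second))
```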
- We compared our data with third-party sources:
- We had access to exclusive information on donations for the 2010 earthquake and tsunami in Chile via the Chilean government. Comparing our database with the list of donors provided by the Chilean government, we found that our dataset covered 68 percent of the donors in the official source; our tracking did not capture donations from small and medium-sized Chilean enterprises that are not multinationals. In terms of magnitude, our dataset accounted for 92 percent of the total corporate aid for the event.
- We worked with staff members of UNOCHA to compare our database with the Financial Tracking System (FTS), a global database that records self-reported international humanitarian aid for different humanitarian crises. The FTS covered about seven percent of our firm donations and 65 percent of our government and NGO donations.
- The U.S. Chamber of Commerce Foundation maintains Disaster Corporate Aid Trackers, self-reported records of company responses to disasters with a focus on U.S. firms. Their data start in 2010 for selected disasters, particularly in the U.S., and account for 11 percent of our database.
- We hired research assistants to run another set of random checks in 2020 on all the data in the database and did not find any discrepancies.
[1] There were spelling mistakes in some articles.
[2] For information about the method of collection of FTS data and their verification, visit the following site: http://fts.unocha.org/pageloader.aspx?page=AboutFTS-Data.
[3] These data are available at https://www.uschamberfoundation.org/corporate-aid-trackers.