TAC KBP Event Argument - Comprehensive Training and Evaluation Data 2016-2017

Item Name: TAC KBP Event Argument - Comprehensive Training and Evaluation Data 2016-2017
Author(s): Joe Ellis, Jeremy Getman, Song Chen, Stephanie Strassel
LDC Catalog No.: LDC2020T18
ISBN: 1-58563-940-0
ISLRN: 982-513-576-529-0
DOI: https://doi.org/10.35111/kg3z-4q43
Release Date: August 17, 2020
Member Year(s): 2020
DCMI Type(s): Text
Data Source(s): newswire, discussion forum
Project(s): TAC
Application(s): event detection, knowledge base population
Language(s): English, Mandarin Chinese, Spanish
Language ID(s): eng, cmn, spa
License(s): LDC User Agreement for Non-Members
Online Documentation: LDC2020T18 Documents
Licensing Instructions: Subscription & Standard Members, and Non-Members
Citation: Ellis, Joe, et al. TAC KBP Event Argument - Comprehensive Training and Evaluation Data 2016-2017 LDC2020T18. Web Download. Philadelphia: Linguistic Data Consortium, 2020.


TAC KBP Event Argument - Comprehensive Training and Evaluation Data 2016-2017 was developed by the Linguistic Data Consortium (LDC) and contains training and evaluation data produced in support of the 2016 TAC KBP Event Argument Linking Pilot and Evaluation tasks and the 2017 TAC KBP Event Argument Linking Evaluation task.

Text Analysis Conference (TAC) is a series of workshops organized by the National Institute of Standards and Technology (NIST). TAC was developed to encourage research in natural language processing and related applications by providing a large test collection, common evaluation procedures, and a forum for researchers to share their results. Through its various evaluations, the Knowledge Base Population (KBP) track of TAC encourages the development of systems that can match entities mentioned in natural texts with those appearing in a knowledge base and extract novel information about entities from a document collection and add it to a new or existing knowledge base.

The Event Argument Extraction and Linking task required systems to extract event arguments (entities or attributes playing a role in an event) from unstructured text, indicate the role they play in an event, and link the arguments appearing in the same event to each other. Since the extracted information must be suitable as input to a knowledge base, systems constructed tuples indicating the event type, the role played by the entity in the event, and the most canonical mention of the entity from the source document. The event types and roles were drawn from an externally-specified ontology of 31 event types, which included financial transactions, communication events, and attacks. For more information about Event Argument Extraction and Linking, refer to the track home page on the NIST TAC website.
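The knowledge-base-ready tuples described above can be sketched as a simple record type. This is an illustrative sketch only; the field names and the example values below are hypothetical, not the official submission format.

```python
# Hypothetical sketch of the (event type, role, canonical mention) tuples
# described above; field names are illustrative, not the official format.
from collections import namedtuple

EventArgument = namedtuple(
    "EventArgument", ["event_type", "role", "canonical_mention", "doc_id"]
)

def make_argument(event_type, role, mention, doc_id):
    """Build one knowledge-base-ready tuple for an extracted event argument."""
    return EventArgument(event_type, role, mention, doc_id)

# Example: an attack event with a hypothetical attacker argument.
arg = make_argument("Conflict.Attack", "Attacker", "the rebel group", "DOC_001")
```

In the evaluation, systems additionally linked tuples belonging to the same event instance; a real submission would group such records by a shared event identifier.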


Source data for the annotations in this corpus consisted of Chinese, English, and Spanish newswire and discussion forum text collected by LDC. The 2016 pilot set source documents are available in this corpus; the 2016 and 2017 evaluation source documents are available in TAC KBP Evaluation Source Corpora 2016-2017 (LDC2019T12). Annotation data is presented as UTF-8 encoded tab-delimited or XML files.
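Reading the tab-delimited annotation files is straightforward with the standard library. The sketch below is a minimal example; the column layout shown (document ID, event type, role, mention string) is hypothetical, so consult the corpus documentation for the actual field order of each file type.

```python
# Minimal sketch of reading a UTF-8 tab-delimited annotation file.
# The column layout here is hypothetical -- check the corpus docs
# for the actual field order before relying on it.
import csv
import io

# Stand-in for a file on disk; in practice: open(path, encoding="utf-8")
sample = "DOC_001\tConflict.Attack\tAttacker\tthe rebel group\n"

with io.StringIO(sample) as f:
    rows = [row for row in csv.reader(f, delimiter="\t")]

doc_id, event_type, role, mention = rows[0]
```

Using `csv.reader` with `delimiter="\t"` (rather than `str.split`) correctly handles fields that might contain embedded quoting.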

A summary of the data is below:

Year  Set         Source Docs  Manual Responses  Assessments  Entities  Fillers  Event Hoppers
2016  pilot             2,092                98        2,689     2,923    1,308          1,500
2016  evaluation           0*               628        7,697    17,681    4,544          6,799
2017  evaluation           0*                 0            0    17,896    5,995          8,022


*NOTE: source documents for the 2016 and 2017 evaluations are available separately as indicated above.


This material is based on research sponsored by the Air Force Research Laboratory and the Defense Advanced Research Projects Agency under agreement number FA8750-13-2-0045. The U.S. Government is authorized to reproduce and distribute reprints for Governmental purposes notwithstanding any copyright notation thereon. The views and conclusions contained herein are those of the authors and should not be interpreted as necessarily representing the official policies or endorsements, either expressed or implied, of the Air Force Research Laboratory and Defense Advanced Research Projects Agency or the U.S. Government.


An ERE sample (XML) is available in the online documentation.
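An ERE file can be walked with the standard library's XML parser. The element names in this sketch follow the DEFT ERE convention (`deft_ere`, `entity_mention`, `mention_text`), but they are an assumption here and should be checked against the sample file shipped with the corpus.

```python
# Hedged sketch of walking an ERE-style XML document with ElementTree;
# the element names follow the DEFT ERE convention but should be
# verified against the corpus's own sample file.
import xml.etree.ElementTree as ET

# Tiny illustrative document standing in for a real ERE file.
sample_xml = """<deft_ere doc_id="DOC_001">
  <entities>
    <entity id="ent-1" type="PER">
      <entity_mention id="m-1" offset="10" length="10">
        <mention_text>John Smith</mention_text>
      </entity_mention>
    </entity>
  </entities>
</deft_ere>"""

root = ET.fromstring(sample_xml)
# Collect the text of every entity mention in document order.
mentions = [m.findtext("mention_text") for m in root.iter("entity_mention")]
```

For real files, `ET.parse(path)` replaces `ET.fromstring`; `iter` performs a document-order traversal, so nesting depth does not matter.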


Updates: None at this time.
