New York City, June 8-9, 2006


CoNLL-X

The Tenth Conference on Computational Natural Language Learning

Call for Papers

CoNLL is the yearly conference organized by SIGNLL (the ACL Special Interest Group on Natural Language Learning). Previous CoNLL meetings were held in Madrid (1997), Sydney (1998), Bergen (1999), Lisbon (2000), Toulouse (2001), Taipei (2002), Edmonton (2003), Boston (2004), and Ann Arbor (2005). This year, CoNLL will be collocated with HLT-NAACL in New York City.

See http://staff.science.uva.nl/~erikt/signll/ and http://staff.science.uva.nl/~erikt/signll/conll/ for more information about SIGNLL and CoNLL.

CoNLL is an international conference for research on natural language learning. We invite submission of papers about natural language learning topics, including, but not limited to:

  • Computational models of human language acquisition
  • Computational models of the evolution of language
  • Machine learning methods applied to natural language processing tasks (speech processing, phonology, morphology, syntax, semantics, discourse processing, language engineering applications)
  • Statistical methods (Bayesian learning, graphical models, kernel methods, statistical models for structured problems)
  • Symbolic learning methods (rule induction and decision tree learning, lazy learning, inductive logic programming, analytical learning, transformation-based error-driven learning)
  • Biologically-inspired methods (Neural Networks, Evolutionary Computing)
  • Reinforcement learning
  • Active learning, ensemble methods, meta-learning
  • Learning architectures for structural and relational NLP tasks
  • Computational learning theory analysis of language learning
  • Empirical and theoretical comparisons of language learning methods
  • Models of induction and analogy in linguistics

Special Topic of Interest

Apart from the topics listed above, this year we wish to encourage the submission of papers that propose learning theories, architectures, algorithms, methods, or techniques for improving the robustness of learning-based NLP systems.

One important type of brittleness in current learning-based NLP systems is domain dependence. Since learning is mainly performed in a supervised setting, even slight differences between training corpora and test corpora (text genre, style, new vocabulary, etc.) may cause substantial degradation in the performance of a system. This fact has been widely reported in the NLP literature and was also clearly observed in the CoNLL-2005 shared task evaluation on Semantic Role Labeling.

Along these lines, we encourage the submission of papers addressing the portability and adaptation of learning-based systems to changing application domains. Relevant keywords include transfer learning, domain adaptation, bootstrapping, semi-supervised learning, and active learning.
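As a purely illustrative sketch of the bootstrapping/self-training idea mentioned above, the Python fragment below repeatedly moves a learner's most confident predictions on unlabeled target-domain data into its training set. The helpers train and predict_proba are hypothetical stand-ins for any supervised learner; this is one possible adaptation strategy among the keywords listed, not a prescribed method.

    # Minimal self-training sketch for domain adaptation (illustrative only).
    # `train(labeled)` returns a model; `predict_proba(model, x)` returns a
    # dict mapping labels to probabilities. Both are hypothetical stand-ins.
    def self_train(source_data, target_pool, train, predict_proba,
                   rounds=5, batch=100):
        labeled = list(source_data)      # labeled (features, label) pairs
        pool = list(target_pool)         # unlabeled target-domain examples
        model = train(labeled)
        for _ in range(rounds):
            if not pool:
                break
            # Score each unlabeled example and keep its most probable label.
            scored = []
            for x in pool:
                label, prob = max(predict_proba(model, x).items(),
                                  key=lambda kv: kv[1])
                scored.append((prob, label, x))
            scored.sort(key=lambda t: t[0], reverse=True)
            confident, rest = scored[:batch], scored[batch:]
            labeled += [(x, label) for prob, label, x in confident]
            pool = [x for prob, label, x in rest]
            model = train(labeled)       # retrain on the enlarged set
        return model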

Moreover, the traditional decomposition of natural language processing into a pipeline of specialized linguistic analyzers can also make end-to-end systems fragile. The assumption that each level can be satisfactorily resolved before advancing to the next processor is clearly false given the current state of the art for most tasks. Experience suggests that error propagation through cascades of processors may, in aggregate, severely degrade performance on the final task. One obvious and appealing, though more complex, solution is to jointly model several subtasks at once, at both the learning and inference stages. This allows systems to capture correlations between stages and to search for global solutions, rather than greedily maximizing local quality. However, practical constraints argue that some decomposition is necessary for efficient learning and inference. Thus, papers addressing the issues involved in processing across multiple linguistic layers will also be welcome.
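To make the contrast concrete, the toy Python sketch below compares greedy pipeline decoding, which commits to the best output of the first stage before running the second, with joint decoding, which searches over combinations of both decisions for the best global score. The functions score_pos, score_chunk, candidate_tags, and candidate_chunks are hypothetical; this is only an illustration of the distinction drawn above, not a proposal.

    # Toy contrast between pipeline and joint decoding (illustrative only).
    # `score_pos` and `score_chunk` are hypothetical log-scores; the
    # `candidate_*` functions enumerate small candidate sets.
    def pipeline_decode(sentence, candidate_tags, candidate_chunks,
                        score_pos, score_chunk):
        # Commit to the best tag sequence first, then chunk given those tags:
        # errors made in the first stage cannot be revised later.
        tags = max(candidate_tags(sentence),
                   key=lambda t: score_pos(sentence, t))
        chunks = max(candidate_chunks(sentence, tags),
                     key=lambda c: score_chunk(sentence, tags, c))
        return tags, chunks

    def joint_decode(sentence, candidate_tags, candidate_chunks,
                     score_pos, score_chunk):
        # Search over (tags, chunks) pairs for the best combined score,
        # letting chunk evidence override a locally weaker tag choice.
        best, best_score = None, float("-inf")
        for tags in candidate_tags(sentence):
            for chunks in candidate_chunks(sentence, tags):
                s = score_pos(sentence, tags) + score_chunk(sentence, tags, chunks)
                if s > best_score:
                    best, best_score = (tags, chunks), s
        return best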

Shared Task: Multilingual Dependency Parsing

See: http://nextens.uvt.nl/~conll/

Best Paper Award

For the first time in the CoNLL series, a Best Paper Award will be given to the authors of the highest-quality paper at the conference. We are looking for scientific contributions that advance the state of the art in natural language learning. The most important criteria for this award will be the originality, innovativeness, relevance, and impact of the presented research.

Invited Speakers

Walter Daelemans
CNTS, University of Antwerp, Belgium

Michael Collins
CSAIL, Massachusetts Institute of Technology, USA

Main Session Submissions

A paper submitted to CoNLL-X must describe original, unpublished work. Submit a full paper of no more than 8 pages in PDF format by March 5, 2006, electronically through the web form at: http://www.softconf.com/start/CoNLL06/submit.html

Only electronic submissions will be accepted. The submitted paper should be in two-column format and follow the HLT-NAACL style (see http://nlp.cs.nyu.edu/hlt-naacl06/cfp.html). Authors who cannot submit a PDF file electronically should contact the program co-chairs.

Since reviewing will be blind, the paper should not include the authors' names and affiliations, and there should be no self-references that reveal the authors' identity. The submission form will ask for the following information: paper title; authors' names, affiliations, and email addresses; contact author's email address; a list of keywords; abstract; and an indication of whether the paper has been simultaneously submitted to other conferences (and, if so, which ones). The contact author of an accepted paper under multiple submissions should inform the program co-chairs immediately whether he or she intends the accepted paper to appear in CoNLL-X. A paper that appears in CoNLL-X must be withdrawn from other conferences.

Authors of accepted submissions will be asked to produce a final version for publication in the conference proceedings, which will be available to participants at the conference and distributed afterwards by the ACL. Final papers must follow the HLT-NAACL style and are due April 21, 2006.

Shared Task Submissions

Papers submitted to the CoNLL-X shared task must have a maximum length of 4 pages and describe the learning approach and the results obtained on the development sets. They are due March 17, 2006. See the shared task web page for complete submission instructions, concrete formats, and styles.

A special section of the CoNLL-X proceedings will be devoted to a comparison and analysis of the results and to a description of the approaches used.

Important Dates

Deadline for main session paper submission: March 5, 2006
Notification of acceptance of main session papers: April 9, 2006
Deadline for camera-ready papers: April 21, 2006
Conference: June 8-9, 2006

The deadlines for the shared task can be found on the shared task web page.

Conference Organizers

Lluís Màrquez
Software Department
Polytechnical University of Catalunya
Barcelona, Catalunya, Spain
lluism (at) lsi.upc.edu

Dan Klein
Computer Science Division
University of California at Berkeley
Berkeley, CA, USA
klein (at) cs.berkeley.edu

Shared Task Organizers

Sabine Buchholz
Toshiba Research Europe Ltd (UK)
sabine.buchholz (at) crl.toshiba.co.uk

Amit Dubey
University of Edinburgh (UK)
adubey (at) inf.ed.ac.uk

Yuval Krymolowski
University of Haifa (Israel)
yuval (at) cs.haifa.ac.il

Erwin Marsi
Tilburg University (The Netherlands)
E.C.Marsi (at) uvt.nl

Information Officer

Erik Tjong Kim Sang
University of Amsterdam (The Netherlands)
erikt (at) science.uva.nl

Program Committee

  • Eneko Agirre, University of the Basque Country, Spain
  • Regina Barzilay, Massachusetts Institute of Technology, USA
  • Thorsten Brants, Google Inc., USA
  • Xavier Carreras, Polytechnical University of Catalunya, Spain
  • Eugene Charniak, Brown University, USA
  • Alex Clark, Royal Holloway University of London, UK
  • James Cussens, University of York, UK
  • Walter Daelemans, University of Antwerp, Belgium
  • Hal Daumé, ISI, University of Southern California, USA
  • Radu Florian, IBM, USA
  • Dayne Freitag, Fair Isaac Corporation, USA
  • Daniel Gildea, University of Rochester, USA
  • Teg Grenager, Stanford University, USA
  • Marti Hearst, SIMS, UC Berkeley, USA
  • Philipp Koehn, University of Edinburgh, UK
  • Roger Levy, University of Edinburgh, UK
  • Rob Malouf, San Diego State University, USA
  • Chris Manning, Stanford University, USA
  • Yuji Matsumoto, Nara Institute of Science and Technology, Japan
  • Andrew McCallum, University of Massachusetts Amherst, USA
  • Rada Mihalcea, University of North Texas, USA
  • Alessandro Moschitti, University of Rome Tor Vergata, Italy
  • John Nerbonne, University of Groningen, The Netherlands
  • Hwee-Tou Ng, National University of Singapore, Singapore
  • Franz Josef Och, Google Inc., USA
  • Miles Osborne, University of Edinburgh, UK
  • David Powers, Flinders University, Australia
  • Ellen Riloff, University of Utah, USA
  • Dan Roth, University of Illinois at Urbana-Champaign, USA
  • Anoop Sarkar, Simon Fraser University, Canada
  • Noah Smith, Johns Hopkins University, USA
  • Suzanne Stevenson, University of Toronto, Canada
  • Mihai Surdeanu, Polytechnical University of Catalunya, Spain
  • Charles Sutton, University of Massachusetts Amherst, USA
  • Kristina Toutanova, Microsoft Research, USA
  • Antal van den Bosch, Tilburg University, The Netherlands
  • Janyce Wiebe, University of Pittsburgh, USA
  • Dekai Wu, The Hong Kong University of Science & Technology, Hong Kong