Deep linguistic processing

Deep linguistic processing is a natural language processing framework which draws on theoretical and descriptive linguistics. It models language predominantly by way of grammatical theories of syntax and semantics (e.g. CCG, HPSG, LFG, TAG, the Prague School). Deep linguistic processing approaches differ from "shallower" methods in that they yield more expressive, structural representations which directly capture long-distance dependencies and underlying predicate-argument structures.[1]
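As an illustration, consider the filler-gap sentence "Which book did Kim say Sandy read?", in which the fronted wh-phrase is the object of the deeply embedded verb. The sketch below is hand-written for exposition (it is not the output of any particular grammar or parser) and shows the kind of predicate-argument structure a deep analysis makes directly readable:

```python
# Hand-written sketch of a predicate-argument structure containing a
# long-distance (filler-gap) dependency: the fronted "which book" is
# the object of the embedded verb "read", not of the matrix verb "say".
sentence = "Which book did Kim say Sandy read?"

analysis = {
    "predicate": "say",
    "agent": "Kim",
    "complement": {
        "predicate": "read",
        "agent": "Sandy",
        "theme": "which book",   # filler resolved to the embedded gap
    },
}

# A shallow, window-based view of the sentence has no slot for this link;
# in the structural representation it can be read off directly:
print(analysis["complement"]["theme"])  # which book
```

The nesting of the `complement` value mirrors the clausal embedding, which is what lets the dependency between the filler and the gap be represented explicitly.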
The knowledge-intensive approach of deep linguistic processing requires considerable computational power, and was in the past sometimes judged intractable. However, research in the early 2000s made considerable advances in the efficiency of deep processing.[2][3] Today, efficiency is no longer a major problem for applications using deep linguistic processing.

Contrast to "shallow linguistic processing"

Traditionally, deep linguistic processing has been concerned with computational grammar development (for use in both parsing and generation). These grammars were manually developed and maintained, and were computationally expensive to run. In recent years, machine learning approaches (also known as shallow linguistic processing) have fundamentally altered the field of natural language processing. Because robust, wide-coverage machine learning NLP tools can be created rapidly and with substantially less manual labor, deep linguistic processing methods have received less attention.

However, some computational linguists believe that detailed syntactic and semantic representations are necessary for computers to understand natural language or perform inference. Moreover, while humans can easily understand a sentence and its meaning, shallow linguistic processing may lack any genuine 'understanding' of human language. For example:[4]

a) Things would be different if Microsoft were located in Georgia.

In sentence (a), a shallow information extraction system might wrongly infer that Microsoft is located in Georgia, whereas a human understands from the counterfactual that Microsoft was never located there.

b) The National Institute for Psychobiology in Israel was established in May 1971 as the Israel Center for Psychobiology by Prof. Joel.

In sentence (b), a shallow system could wrongly infer that Israel was established in May 1971. Humans know that it is the National Institute for Psychobiology that was established in 1971.
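The failure mode in sentence (b) can be sketched with a toy proximity-based extractor (illustrative only; no real system uses exactly this rule): it pairs the noun nearest the verb with the event, while a deep analysis would attach "in Israel" as a modifier of the full subject noun phrase.

```python
import re

SENTENCE = ("The National Institute for Psychobiology in Israel was "
            "established in May 1971 as the Israel Center for Psychobiology "
            "by Prof. Joel.")

def shallow_extract(text):
    """Naive proximity heuristic: treat the capitalized word immediately
    before 'was established' as the entity being established."""
    m = re.search(r"\b([A-Z][a-z]+)\s+was\s+established\s+in\s+(\w+ \d{4})", text)
    if m:
        return {"established": m.group(1), "when": m.group(2)}
    return None

# The heuristic picks up "Israel", the noun nearest the verb:
print(shallow_extract(SENTENCE))
# {'established': 'Israel', 'when': 'May 1971'}

# A deep analysis instead yields a predicate-argument structure (written
# by hand here) in which "in Israel" merely modifies the subject NP:
deep_analysis = {
    "predicate": "establish",
    "theme": "the National Institute for Psychobiology",
    "theme_modifier": "in Israel",
    "time": "May 1971",
}
print(deep_analysis["theme"])  # the National Institute for Psychobiology
```

The contrast is the point of the example: the shallow rule operates on surface adjacency, while the structural representation keeps the prepositional modifier inside the subject constituent, so the wrong inference never arises.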
In summary, deep linguistic processing provides a knowledge-rich analysis of language through manually developed grammars and language resources, whereas shallow linguistic processing provides a knowledge-lean analysis through statistical or machine-learning manipulation of texts and/or annotated linguistic resources.

Sub-communities

"Deep" computational linguists are divided in different sub-communities based on the grammatical formalism they adopted for deep linguistic processing. The major sub-communities includes the:

  • The DEep Linguistic Processing with HPSG INitiative (DELPH-IN), a collaboration working with the HPSG formalism. The HPSG Conference is the central venue for sharing advances in HPSG-based deep processing.
  • ParGram/ParSem, an international collaboration on LFG-based grammar and semantics development. The LFG Conference is the central venue for sharing advances in LFG-based deep processing.
  • The XTAG research group, working with the TAG formalism. The TAG+ conference is the central venue for sharing advances in TAG-based deep processing.

The list above is not exhaustive of the communities working on deep linguistic processing.

References

  1. Timothy Baldwin, Mark Dras, Julia Hockenmaier, Tracy Holloway King, and Gertjan van Noord. 2007. The impact of deep linguistic processing on parsing technology. In Proc. of the 10th International Workshop on Parsing Technologies (IWPT-2007), pages 36–8, Prague, Czech Republic.
  2. Ulrich Callmeier. PET – A platform for experimentation with efficient HPSG processing techniques. Natural Language Engineering, 6(1):99–108, 2000.
  3. Hans Uszkoreit. New Chances for Deep Linguistic Processing Archived 2005-11-03 at the Wayback Machine. In Proceedings of COLING 2002, pages xiv–xxvii, Taipei, Taiwan, 2002.
  4. Ulrich Schäfer. 2007. Integrating Deep and Shallow Natural Language Processing Components – Representations and Hybrid Architectures. Ph.D. thesis, Faculty of Mathematics and Computer Science, Saarland University, Saarbrücken, Germany.