BAAL 2010 Aberdeen

Book of Abstracts: Draft mid-August 2010

CONTENTS

Plenary Papers

Colloquia

SIG Track Papers

Corpus Linguistics

Gender and Language

Language in Africa

Language Learning and Teaching

Psycholinguistics

Testing, Evaluation and Assessment

UK Linguistic Ethnography Forum

Individual Papers

Posters

List of Contributors

PLENARY PAPERS

Gaelic development in Scotland and problems of capacity

Wilson McLeod

Over the course of the last decade, efforts to sustain and develop the Gaelic language in Scotland have entered a new phase, partly as a result of the opening of the Scottish Parliament in 1999. Key developments include the passage of the Gaelic Language (Scotland) Act 2005, which, among other things, requires public bodies to put in place Gaelic language plans, and the establishment of the dedicated Gaelic television service BBC ALBA in 2008.

It has become increasingly questionable, however, whether the Gaelic language community and indeed the language itself have the capacity to allow for successful implementation of these initiatives. This problem of capacity is apparent in a number of different ways.

First, there is a disparity between the extreme minoritisation of the Gaelic language in Scotland – spoken by just 59,000 people, 1.2% of the national population – and policy measures which aim to institutionalise its new status as an official language. Gaelic is much more marginal in Scotland than other languages for which such strategies have been adopted, such as Welsh and Basque.

Second, although teacher shortages have been a long-standing problem, there are increasing difficulties in recruiting staff with the necessary skills to take on the broader range of work associated with Gaelic language development, including key leadership posts in the Gaelic sector.

Third, the long-standing inattention to corpus planning in Gaelic (arguably unusual in the European minority language context) is becoming increasingly problematic as the language is pushed into new areas of use. There are, for example, no structures for the training or certification of translators and interpreters, and the print media are contracting rather than expanding.

Imagined identities, grassroots literacy, and digital resources

Bonny Norton

In previous work, I have made the case that the conditions under which language learners speak, read, or write are best explained with reference to the multiple identities they negotiate in classrooms and communities. This presentation explores how, and to what extent, developments in digital technology impact the imagined identities of language learners and teachers in poorly resourced communities, such as rural Uganda. The central argument is that despite the formal context of schooling, the conditions under which these learners and teachers invest in multilingual language and literacy practices approximate what Blommaert calls ‘grassroots literacy’, i.e. literacy ‘performed by people who are not fully inserted into elite economies of information, language, and literacy’ (2008, p. 7). I draw on extensive research in the African context to make the case that unique innovations in digital technology offer a range of opportunities for language learners and teachers in poorly resourced communities to claim identities as global citizens.

Language, context and the locality of the local

Alastair Pennycook

In this paper I shall argue the need to rethink the relation between the global and the local. All too often these are juxtaposed and mapped against the big and the small, new and old, international and regional, modern and traditional, mobile and static. On the one hand, the local may be downplayed as the poor cousin of the global; on the other, it may be upheld as all that needs to be preserved and praised as the locus of diversity. There are several problems with this: First, we need to consider the basis of a model that equates homogeny with the global and heterogeny with the local. Second, we need to consider that the two terms sit in a very complex relation to each other: everything that happens happens locally; everything that happens globally also happens locally. This relation is not helped by terminology such as ‘glocal’, since this simply elides the two ideas. And third, without a thorough exploration not only of globalization but also of localization (and relocalization in relation to recontextualization and resemioticization), the two terms cannot capture the dynamic of change. The idea of the local needs to go beyond an equivalence with context in order to take on board dimensions of place. If we accept that English, for example, is a global language (it is extensively used and closely linked to other forms of globalization), its manifestation as a local language is all too often described in terms of regionally stable new varieties rather than dynamic processes of relocalization. This paper draws on a number of different contexts of language use, from metrolingualism to early literacy, to ask what it means to ‘talk like a local’.

Crimes against humanity in education, and applied linguistics - corporate globalisation or geopolitical knowledge glocalisation?

Tove Skutnabb-Kangas

Lately, in several workshops, conferences and reports, western-schooled representatives of various sciences (including medical doctors) and representatives of traditional knowledges (including shamans and ‘medicine men/women’) have tried to discuss, debate and clarify the relative validity and reliability of their respective knowledges.

An important precaution is to admit that not all traditional knowledge is valid: some may be based on myths, ideology and status competition. It is equally important to admit that some western-based knowledge can be just as invalid. Today, corporate globalisation, for instance competition between pharmaceutical multinationals, creates myths and ideologies, in addition to the knowledge mining, attempts at patenting various aspects of life and other types of exploitation that these corporations are guilty of. Much of the knowledge exploited can be centuries old, and it is often more sophisticated and uses more nuanced categories than much of western science, a fact that has been accepted by the International Council for Science (ICSU) in a 2002 report.

This leads us to questions of scientific imperialism. In postcolonial theory and analysis of the purpose of bilingual and intercultural education for Indigenous and tribal peoples, according to Susanne Pérez (2009), the issue of not only which language should be the medium of instruction but also what should be taught in Indigenous education, in, for instance, physics classes,

should not be reduced to a ‘technical’ question of finding the best indigenous word for ‘cell’ or ‘atmosphere’, but requires discussion of the ideological implications when it is assumed that the introduction of what counts as academic knowledge, reasoning and ‘truths’ is good.

Indigenous peoples, anthropologists and others have questioned this truth, but their efforts were branded as ‘ethno-academic’: for instance, ethno-mathematics, ethno-biology, ethno-medicine and ethno-astronomy. But why is some knowledge classified as ‘ethnic’, in contrast to ‘pure’ knowledge, as in ‘pure mathematics’? ‘Nowadays, “ethno-” is used in a quite liberal way (…) in order to indicate that the investigation of a particular field of study (as biology or astronomy) is made from the perspective of and based on the knowledge of a “traditional” non-occidental society’ (Urton 2003: 21).

By classifying non-occidental knowledge as ‘traditional’ or ‘local wisdom’, it is fixed in time and space:

At the same time, [concepts such as] ‘abstract’, ‘neutral’, ‘pure science’ or ‘universal knowledge’ hide the fact that all knowledge is produced by somebody, at a certain time in history and at a certain place in history. By defining academic knowledge as time- and spaceless, Western scientists are trying to hide their own philosophical foundations (Urton 2003: 21).

‘[T]he “history” of knowledge is marked geo-historically, geo-politically and geo-culturally; it has a value, colour and a place “of origin”’ (Walsh 2004: 2). Thus when indigenous epistemologies, philosophies and ways of ‘doing science’ are questioned and reduced to ‘local wisdom’ or ‘ethno-sciences’ by occidental scientists, they are actually reproducing colonial and neocolonial power relations. It is a colonisation of knowledge. ‘Access to occidental academic knowledge is presented as access to the “modern world” and “development”, which ultimately reproduces the bonds of colonialism’ (Pérez 2009: 213).

These questions have been discussed among Indigenous peoples and non-western scientists for a long time. But it seems that in many ‘mainstream’ (another loaded term) subfields of applied linguistics (e.g. ESL or bilingual education) these discussions are either at their very beginning or, where they have been part of the discourse, have done little to change the ways these fields act or even see themselves. With support from such fields, linguistic genocide in education and crimes against humanity are still being committed, with the perpetrators in most cases not even aware of how what they are legitimating or doing might be labelled.

The paper will discuss some of these issues of scientific and other neo-imperialism within some subfields of applied linguistics in a holistic way, with arguments from education, sociology, human rights law, biodiversity studies, and studies about the maintenance of endangered languages as living languages (see Skutnabb-Kangas & Dunbar 2010).

References:

Pérez, Susanne Jacobsen (2009). ‘The contribution of postcolonial theory to intercultural bilingual education in Peru: an indigenous teacher training programme’. In Skutnabb-Kangas, Tove, Phillipson, Robert, Mohanty, Ajit and Panda, Minati (eds). Social Justice through Multilingual Education. Bristol: Multilingual Matters, 201-219.

Skutnabb-Kangas, Tove and Dunbar, Robert (2010). Indigenous Children’s Education as Linguistic Genocide and a Crime Against Humanity? A Global View. Gáldu Čála. Journal of Indigenous Peoples’ Rights 1, 2010. Guovdageaidnu/Kautokeino: Gáldu, Resource Centre for the Rights of Indigenous Peoples. Available as a free e-book at http://www.e-pages.dk/grusweb/55/.

COLLOQUIA

C1

Vocabulary Studies SIG Colloquium

Colloquium Convenor(s): James Milton, Swansea University

Vocabulary studies have enjoyed something of a renaissance in the last 20 years, after a period when the subject was downplayed in importance and rather neglected. Perhaps as a result, much of the work we do now draws on studies not just from the recent past but from considerably earlier: the time before structuralist approaches to language teaching effectively sidelined the subject. For example, the work on lexical sophistication and lexical diversity in the analysis of written texts, which still excites a considerable literature, from the type-token ratio (TTR) through to Malvern and Richards’ D, draws extensively on the ideas of George Udny Yule in the 1940s. Another example is West’s General Service List, developed in the 1930s, which laid the foundation for much modern corpus-based work and is still used, for example, in the development of Coxhead’s Academic Word List and Cobb’s VocabProfile. Much of this material is now quite hard to access. Even comparatively recent papers, such as Laufer’s hugely cited and influential work which gives rise to the 95% coverage figure required for comprehension, can be quite hard to track down in their original form. Sometimes these works, quite undeservedly, disappear from view. Others develop a life of their own, as with Laufer’s 95% coverage figure, which is often over-generalised and misinterpreted.

The intention of this colloquium is to allow researchers in the field of vocabulary studies to propose papers which they think have been particularly important in the development of the subject and in shaping its study today. Presenters will describe their chosen work and explain why they think it is so important. Those attending the colloquium will be given the chance to judge the proposals and indicate which they think are truly important. We are expecting some lively debate on the choices. We have interest from a publisher in an Essential Readings in Vocabulary Studies book to be developed from this colloquium.

The format will therefore be a series of short papers (5 – 10 minutes only) without a break for discussion between individual papers, followed by a voting system and general discussion.

Paper 1

James Milton, Swansea University

Batia Laufer (1989) What percentage of text-lexis is essential for comprehension? In Laurén, C. and Nordman, M. (eds) Special Language: From Humans Thinking to Thinking Machines. Clevedon: Multilingual Matters, 316-323.

This paper has proved highly influential in establishing in the minds of learners and teachers the idea that coverage and comprehension are inextricably linked, and that a certain level of coverage is required before a text can be fully comprehended. The figure which Laufer quotes, that ‘above 95% coverage it becomes possible to read without using a dictionary’, has formed the basis of all sorts of studies which seek to determine whether learners’ vocabulary levels are adequate for reading tasks. It is widely cited, probably because it contains a truth we all recognise: in order to understand a written text you probably need to understand pretty much all the words in it. But the paper is often considerably misinterpreted and seems to contradict itself as to what it means by full comprehension. Laufer appears to mean comprehension for B2-level learners taking exams at that level, for example, and this may be some way from full comprehension as educated native speakers might understand it. Laufer’s conclusion that 95% coverage requires knowledge of about 3,000 words is also unclear, given the unit of counting. It is a highly influential paper, but one that probably requires more sophisticated interpretation than it is usually given.
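Coverage in this sense is simply the proportion of running words (tokens) in a text that the reader knows. A minimal sketch of such a calculation, assuming a plain token-by-token match against a known-word list (Laufer and later studies count word families rather than raw tokens; the text and list below are invented for illustration):

    def lexical_coverage(tokens, known_words):
        # Proportion of running words (tokens) found in the known-word list.
        known = sum(1 for t in tokens if t.lower() in known_words)
        return known / len(tokens)

    text = "The cat sat on the mat and purred".split()
    known = {"the", "cat", "sat", "on", "mat", "and"}
    print(f"{lexical_coverage(text, known):.1%}")   # 7 of 8 tokens -> 87.5%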

Paper 2

Brian Richards, University of Reading

Weizman, Z. O., & Snow, C. E. (2001). Lexical input as related to children's vocabulary acquisition: Effects of sophisticated exposure and support for meaning. Developmental Psychology, 37, 265-279.

Fifty-three children from low-income families in the United States were studied as part of a longitudinal study of language, literacy and educational achievement in disadvantaged children. In this investigation into vocabulary development, vocabulary input and the associated interaction in the home were assessed when the children were five years old and related to Peabody Picture Vocabulary Test (PPVT) scores for receptive vocabulary at the end of kindergarten and second grade. Particular emphasis was placed on the role of ‘sophisticated’ lexis (items outside the 3,000 common words known by most fourth graders) in the input across five separate contexts (e.g. storybook reading, toy play). In fact, less than 2% of maternal vocabulary was low-frequency, but its occurrence was more strongly related to children’s later vocabulary scores than was the total amount of vocabulary input.

Both the density of sophisticated lexis and the degree of informativeness of the interaction in conveying the meaning of a sophisticated word independently predicted more than a third of the variance in children’s vocabulary in later assessments. After controlling for child non-verbal IQ, quantity of child talk and maternal education, they jointly predicted half the variance at the end of second grade.

Paper 3

Katja Mäntylä, University of Jyväskylä

Håkan Ringbom's article Crosslinguistic lexical influence and foreign language learning (1991) addresses two important questions: what does knowing a word entail, and how does previous vocabulary knowledge affect vocabulary acquisition?

Often, when knowing a word is discussed, a simple list is given, including, e.g., spoken and written form, meaning, collocations, associations, syntax, and frequency. However, knowing a word is not that black and white: various aspects of vocabulary knowledge develop gradually, and a language user may, for instance, be familiar with all possible meanings of a word and know some of its inflected forms, yet have no collocational knowledge. Ringbom’s article presents a model of lexical knowledge that takes this gradual nature into account and expresses it clearly and explicitly. The article also gives a concise overview of cross-linguistic lexical influences, with several illustrative examples. Although there is a substantial amount of recent research on cross-linguistic effects, the points Ringbom makes are still valid today. He discusses not only the effect of the L1 on the L2, but also how existing L2 vocabulary knowledge influences the acquisition of new L2 vocabulary. Ringbom’s examples also reflect differences between language families, as he studied English produced by L1 Swedish and L1 Finnish speakers, the former a Germanic language like English and the latter belonging to a different language family.

Paper 4

Jeanine Treffers-Daller, UWE Bristol

Jack Richards (1976) The role of vocabulary teaching. TESOL Quarterly 10(1): 77-89.

In this paper I will review Jack Richards’ (1976) seminal paper, which has been and still is very influential in the field. Although some authors assumed a multidimensional model of word knowledge very early on (e.g. Cronbach 1942), Richards was probably one of the first to summarise the many dimensions of word knowledge in a systematic way. His list of eight assumptions regarding vocabulary knowledge shows he was far ahead of others at the time in pointing out that knowing a word entails much more than a form-meaning mapping. L1 users also possess knowledge about the frequency of items, the collocations in which they occur, sociolinguistic variation in their use, a word’s syntactic and morphological properties, and the networks of associations and meaning relations between words. I will also briefly review how our understanding of what it means to know a word has developed since Richards’ paper, and how changing views on this matter have influenced the ways in which we measure or assess this knowledge.

Paper 5

Huw Bell, MMU

Ure, J. 1971. Lexical density and register differentiation. In G.E. Perren and J.L.M. Trim (eds.) Applications of linguistics: selected papers of the Second International Congress of Applied Linguistics, Cambridge 1969. Cambridge: Cambridge University Press, 443-452.

Ure’s 1971 paper is a ‘great moment’ in vocabulary studies, something of a breakthrough. Although it made use of an existing distinction between ‘lexical’ (content) and ‘grammatical’ (function) words, it was the first to clearly define the concept of Lexical Density. Working with two corpora of around 20,000 words, and using a relatively simple formula based on dividing the number of lexical words by the number of tokens, Ure calculated the percentage of lexical words for a series of texts taken from a variety of spoken and written sources, and from a variety of discourse types. The results showed that Lexical Density varied systematically and widely. The paper continues to be important for a number of reasons. It gives a strong first empirical account of the importance of genre and the ‘social function’ of text. It has enjoyed considerable longevity, and feeds a continuing stream of research across the L1/L2 divide. Ure’s formula, unlike many others, possesses formal grace and intuitive appeal. It has also been employed by researchers in a surprisingly wide variety of applications, including assessment studies, genre analysis, corpus work and discourse analysis. It also illustrates and typifies a number of unresolved issues in lexical measurement. This paper will consider these issues in more depth and consider some possible futures for Lexical Density.
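A minimal sketch of the calculation Ure describes, assuming the lexical/grammatical classification can be approximated by a stop-list of function words (a real implementation would use a part-of-speech tagger; the stop-list and sample sentence are invented for illustration):

    # A toy stop-list of grammatical (function) words; Ure's own
    # classification is considerably richer than this.
    FUNCTION_WORDS = {"the", "a", "an", "and", "or", "but", "of", "to",
                      "in", "on", "is", "are", "was", "were", "it", "this"}

    def lexical_density(tokens):
        # Lexical (content) words as a percentage of all running words.
        lexical = [t for t in tokens if t.lower() not in FUNCTION_WORDS]
        return 100 * len(lexical) / len(tokens)

    tokens = "The cat chased a small mouse in the garden".split()
    print(f"{lexical_density(tokens):.1f}%")   # 5 lexical / 9 tokens = 55.6%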

Paper 6

Imma Miralpaix, University of Barcelona

Ellegård, A. (1960). Estimating vocabulary size. Word 16: 219-244.

Ellegård (1960) addresses a number of issues that are fundamental to a better understanding of the frequency of word occurrence in different texts, and of how this measure can help estimate the amount of vocabulary known by a particular writer. It also approaches the topic in quite innovative ways: some of the ideas put forward in this work have become common procedures in vocabulary research. The article starts by estimating the vocabulary likely to be found in texts of various lengths and topics: the author predicts the contribution of words from different frequency ranges in the language as a whole, or in any ‘theoretically mixed samples’. Secondly, he compares the vocabularies of actual texts with each other. With the information obtained, it is possible to study the relationship between ‘theoretical’ and ‘observed’ vocabulary values and therefore to estimate the vocabulary of an individual. Until then, the size of a test-taker’s vocabulary was mostly determined by the size of the dictionary from which the test sample was taken. Ellegård points out that instead of trying to estimate an individual’s vocabulary in relation to the language as a whole, or in relation to a particular dictionary, it could be more properly estimated in relation to the words of the language within a definite frequency range, computed from counts based on methodologically selected materials. Other interesting points of the article are the use of techniques such as frequency profiling, and the application of the Poisson formula and Zipf’s law to support some of his arguments. He also problematises the established notion of ‘word’, distinguishing between ‘lexical unit’, ‘word form’ and ‘semantic element’ and checking these distinctions against different languages.
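The Poisson-based reasoning alluded to above can be sketched as follows: if a word has relative frequency p in the language, it is expected to occur Np times in a sample of N tokens, and under a Poisson model the probability that it occurs at least once is 1 - e^(-Np). Summing this over the words of a frequency band gives the expected number of distinct words from that band observed in the sample, which is one way ‘theoretical’ vocabulary values of the kind Ellegård discusses can be computed. A minimal sketch, with invented frequency figures:

    import math

    def expected_types(freqs_per_million, sample_size):
        # Expected number of distinct words observed in a sample of
        # `sample_size` tokens, assuming each word's occurrences follow
        # a Poisson distribution with mean (relative frequency * N).
        return sum(1 - math.exp(-(f / 1_000_000) * sample_size)
                   for f in freqs_per_million)

    # A hypothetical band of 100 words, each occurring 50 times per million:
    band = [50] * 100
    print(expected_types(band, 10_000))   # ~39 of the 100 words expected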

Paper 7

Michael Daller, UWE Bristol

Guiraud, P. (1954). Les caractères statistiques du vocabulaire. Paris: Presses Universitaires de France.

Vocabulary measures based on the ratio between types and tokens face a fundamental methodological problem: the type-token ratio (TTR) systematically decreases with increasing text length, since speakers and writers run out of new words the longer they speak or write. As early as 1915, Thomson and Thompson wanted to compare the writing vocabulary size of school children and sought to address this issue in their work, but seem to have been interrupted by the First World War. A number of modern approaches have been suggested to overcome this methodological problem (see Daller, Milton and Treffers-Daller 2007), but several researchers were aware of it much earlier. An approach that makes reference to Thomson and Thompson (1915) is presented by Guiraud (1954). Based on an analysis of the vocabulary of literary texts and dictionaries, Guiraud proposes an index ‘R’ (Richesse de vocabulaire) that is constant over text lengths of up to 100,000 words (Guiraud 1954: 52). Guiraud was also aware of the problem of what counts as a single word, and takes a quite modern approach to multi-word units: he suggests that not only should verbs and their auxiliaries be counted as one word, but also expressions used as a whole, such as ‘par conséquent’ (1954: 20). This approach fits nicely with recent work on formulaic sequences, and it can give new directions for future developments of measures of lexical richness.
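Guiraud’s index is standardly given as the number of types divided by the square root of the number of tokens, R = V / sqrt(N); the formula is not spelled out in the abstract above, but this is the usual formulation attributed to Guiraud (1954). A minimal sketch contrasting it with the raw TTR (the sample text is invented):

    import math

    def ttr(tokens):
        # Type-token ratio: tends to fall steadily as text length grows.
        return len(set(tokens)) / len(tokens)

    def guiraud(tokens):
        # Guiraud's R = types / sqrt(tokens); since the number of types
        # in real text grows roughly with the square root of its length,
        # R stays far more stable across text lengths than the raw TTR.
        return len(set(tokens)) / math.sqrt(len(tokens))

    sample = "the quick brown fox jumps over the lazy dog".split()
    print(f"TTR = {ttr(sample):.3f}, R = {guiraud(sample):.3f}")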

Paper 8

Norbert Schmitt, Nottingham University

Nation, I.S.P. (2006). How large a vocabulary is needed for reading and listening? The Canadian Modern Language Review 63(1): 59-82.

Estimates of the vocabulary size required to use English receptively and productively used to range from 2,000 to 5,000 word families. However, in 2006, Paul Nation recalculated the requirements using more up-to-date BNC corpus data and a 98% coverage criterion, and found that they were much higher: 6,000-7,000 word families for spoken discourse and 8,000-9,000 word families for written discourse. This indicates that learners must reach much higher vocabulary sizes than previously thought, and has major implications for the way language and vocabulary teaching is approached.
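The logic of such a recalculation can be sketched as follows: each 1,000-word-family frequency band contributes a certain proportion of the running words in a corpus, and bands are accumulated, from the most frequent downwards, until the coverage criterion is met. A minimal sketch, with invented per-band coverage figures rather than Nation’s BNC values:

    def families_needed(band_coverages, criterion=0.98):
        # Accumulate 1,000-family frequency bands (most frequent first)
        # until the proportion of running words covered reaches `criterion`.
        total = 0.0
        for i, cov in enumerate(band_coverages, start=1):
            total += cov
            if total >= criterion:
                return i * 1000, total
        return None, total   # criterion not reached with these bands

    # Invented per-band coverage proportions, NOT Nation's BNC figures:
    bands = [0.80, 0.08, 0.04, 0.02, 0.015, 0.012, 0.008, 0.006, 0.004]
    print(families_needed(bands))   # -> 8000 families, ~0.981 coverage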

Paper 9: Word recognition in an L2: The contribution of J. Cattell.

Paul Meara, Swansea University

This presentation will deal with a paper published by Cattell in 1898. Despite the early date of this publication, Cattell managed to establish a number of critical features about the way L2 speakers deal with words. Although his timing apparatus was essentially driven by clockwork, his recognition time data was incredibly accurate, and remained largely unchallenged for 70 years. The talk will show a working model of Cattell's apparatus, and consider some of the implications of the way technology drives research (or not).

C2

Applied Linguistics in Intercultural Communication: Current Perspectives and Approaches

Colloquium Convenor(s): Tony Young, Newcastle University (tony.young@ncl.ac.uk), Jane Woodin, Sheffield University (j.woodin@sheffield.ac.uk)

This colloquium is the first contribution of the new Intercultural Communication (IC) SIG to BAAL Annual Meetings. The SIG aims to bring together researchers and practitioners whose interests lie at the intersection of applied linguistics and intercultural communication.

Invitees to the colloquium were asked to consider how applied linguistics can make a significant contribution to the multidisciplinary field of intercultural communication. Suggested focus questions to address were:

  • To what extent is it possible to apply the three identifiable but competing paradigms in intercultural communication (the social science, interpretive and critical approaches) to research and practice in intercultural communication pedagogy?

  • How can we study groups without a priori categorisation? Is essentialism essential in IC research?

  • To what extent are multimodality and/or mixed methodologies desirable, possible or necessary in IC research?

  • What are the implications of non-western perspectives on human communication for theory and practice in IC?

In addressing these and related questions, this colloquium aims to showcase and open up for discussion the range of conceptual, methodological and pedagogical perspectives and approaches which exist under the ambit of our SIG.

The colloquium consists of 6 papers and a roundtable discussion of matters arising involving presenters and the audience.
