1.3   Computers and Critics Continued

1.3.2 Concepts of Structure

Computer-assisted criticism, by virtue of its method, views the text as a material sequence of symbols, and therefore may employ linguistic models when appropriate. In addition, it may develop or adopt non-linguistic models as it deems necessary and justified. These computer models of text have usually been of two different kinds.
The first kind are those that describe or predict patterns along the 'horizontal' or textual axis. These might involve examining the usage of words in general, or of certain words in particular. Other examples might be studies of sentence or soliloquy length, unless these have been defined as higher categories.
The second kind are those that deal with patterns among the parallel 'vertical' strata of the higher and more abstract categories.

Typical examples here might be the examination of words defined as images, the higher groupings of these called themes, study of grammatical patterns, and so on. These are only some of the thousands of possible secondary strata. It should also be noted that the size of the portion of text being studied will influence the context within which the patterns are interpreted. If the focus is on the sentence, the context would probably be linguistic, whereas if the focus is on the speeches of characters, the context might be one of discourse analysis.

For each of these ways of looking at the text, computer-assisted criticism can do two distinct things. Firstly, it can describe the form or behaviour of the area of investigation over the entire segment of text being studied, or over small portions of it. This can give valuable insights into its prevalence, and therefore possible importance. It can also reveal variations in the distribution of the feature throughout the text, and hence point to areas for more intensive study. In the case of a thematic study, where a theme is described in terms of a collection of semantically related words, such a behavioural investigation might plot in some graphic form the occurrences of specific words throughout the text.
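Such a behavioural investigation is straightforward to sketch in code. The fragment below is a minimal illustration (not any particular critic's tool): a theme is assumed to be simply a list of semantically related words, and the text is divided into equal-sized segments so that the occurrences of those words can be counted segment by segment and later plotted.

```python
import re

def theme_distribution(text, theme_words, n_segments=10):
    """Count occurrences of theme words in equal-sized segments of a text."""
    tokens = re.findall(r"[a-z']+", text.lower())
    theme = set(w.lower() for w in theme_words)
    seg_len = max(1, len(tokens) // n_segments)
    counts = [0] * n_segments
    for i, tok in enumerate(tokens):
        if tok in theme:
            # Clamp the final segment so trailing tokens are not lost.
            seg = min(i // seg_len, n_segments - 1)
            counts[seg] += 1
    return counts

# Example: a toy 'darkness' theme tracked across a short passage.
sample = ("night fell and the dark woods grew darker still ; "
          "morning light broke bright and clear over the shadowed hills")
print(theme_distribution(sample, ["night", "dark", "darker", "shadowed"],
                         n_segments=4))  # → [1, 2, 0, 1]
```

The resulting list of segment counts is exactly the kind of distribution curve that can be graphed, or passed on to the characterizing models discussed below.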

Secondly, computer-assisted criticism can employ models that characterize that distribution. Such a model goes beyond mere description, perhaps uncovering the underlying dynamics of the distribution. This in turn enables easy comparison, as regards the distribution of the given feature, of any text with other similarly analyzed texts. It should be noted that any feature that can be defined in precise, functional terms may be selected for such analysis.

Smith lists a number of possibilities for the analysis itself in his discussion. Fourier Analysis 16 is a mathematical technique usually applied to the analysis, and generation, of complex waveforms. However, it can also be used to examine the distribution curves that describe a particular feature. This technique can significantly enhance the critic's ability to detect fluctuations in the feature. A technique that can be successfully used to examine co-occurrences of various kinds is Factor Analysis 17. This determines specific clusters or groups of features that consistently occur in close proximity.
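The Fourier idea can be illustrated briefly. The sketch below is not Smith's implementation, merely an assumed minimal version: given a feature-distribution series (such as segment counts for a theme), the discrete Fourier transform reveals whether the feature rises and falls with a regular rhythm across the text, and at what scale.

```python
import numpy as np

def dominant_cycle(series):
    """Return the dominant non-zero frequency (in cycles per text)
    of a feature-distribution series, via the discrete Fourier transform."""
    x = np.asarray(series, dtype=float)
    x = x - x.mean()                      # remove the constant (DC) component
    power = np.abs(np.fft.rfft(x)) ** 2   # power spectrum of the series
    k = int(np.argmax(power[1:])) + 1     # skip the zero-frequency bin
    return k

# A feature that rises and falls twice over the text's length:
series = [0, 2, 4, 2, 0, 2, 4, 2]
print(dominant_cycle(series))  # → 2 (two full cycles across the text)
```

A strong low frequency would indicate a broad ebb and flow of the feature; strong high frequencies would indicate rapid local fluctuation, and either pattern might direct the critic to particular stretches of text.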

State Diagrams 18 can be used to trace developing networks of associations between features, or across different categories. Finally, there is Smith's own CGAMS 19. This derives its title from his doctoral work on Joyce's A Portrait of the Artist, and stands for Computer Generated Analogues of Mental Structure. It provides a contour map, or three-dimensional picture, of the relationships between various themes in a text. These models can in themselves create higher strata for investigation. For instance, CGAMS might indicate a cluster of themes which deserves to be treated as a separate entity, a hypertheme. These then become a higher level of structure themselves, and thus are susceptible to analysis in their own right. All of these techniques, and numerous others besides, offer possibilities that most critics have not even begun to explore. Their applications are limited only by one's imagination.
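The hypertheme idea can be made concrete with a small sketch. This is emphatically not CGAMS itself, only an assumed toy version of the underlying move: themes whose occurrences repeatedly fall close together in the text are grouped into candidate hyperthemes, which can then be analyzed as units at a higher stratum.

```python
from itertools import combinations

def cooccurrence_clusters(occurrences, window=50, threshold=2):
    """Group themes into candidate 'hyperthemes': themes whose occurrences
    fall within `window` tokens of each other at least `threshold` times.

    occurrences: dict mapping theme name -> list of token positions.
    Returns a list of clusters (sets of theme names)."""
    # Count near co-occurrences for each pair of themes.
    linked = []
    for a, b in combinations(sorted(occurrences), 2):
        close = sum(1 for i in occurrences[a] for j in occurrences[b]
                    if abs(i - j) <= window)
        if close >= threshold:
            linked.append((a, b))
    # Merge linked pairs into clusters (union of overlapping pairs).
    clusters = []
    for a, b in linked:
        for c in clusters:
            if a in c or b in c:
                c.update((a, b))
                break
        else:
            clusters.append({a, b})
    return clusters

# 'dark' and 'water' co-occur closely twice; 'light' stands apart.
positions = {"dark": [10, 200, 410], "water": [15, 205, 900], "light": [600]}
print(cooccurrence_clusters(positions))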

Most of the schools of conventional criticism discussed in the previous section have used a stratified view of text as the basis for their concept of form or structure. One major contribution of the Prague Structuralists is contained in the Jakobson/Levi-Strauss study mentioned earlier. While it had often been suggested that it should be possible to analyze literary works in terms of relations within and among a number of literary strata, this was the first full-length attempt to analyze a complete, albeit small, work. The Prague Structuralists have also been active in the area of establishing normative patterns against which specific stylistic deviations can be measured. One member of the Prague School, Lubomir Dolezel, has proposed 20 that the correct starting point for any such study should lie in the collection of a large number of statistical measures for a variety of texts. Clearly, only the use of the computer would allow the completion of such a monumental task within one person's lifetime.
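Dolezel's proposal of collecting statistical measures en masse is precisely the kind of task the computer makes tractable. The fragment below is a hypothetical illustration of such a collection step (the particular measures chosen here are mine, not Dolezel's actual list): a small statistical profile computed for any text, which could then be tabulated across a large corpus to establish the norms he envisages.

```python
import re

def stylistic_profile(text):
    """Compute a few simple statistical measures for a text
    (illustrative choices only, not Dolezel's actual inventory)."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    tokens = re.findall(r"[A-Za-z']+", text.lower())
    return {
        "sentences": len(sentences),
        "tokens": len(tokens),
        "mean_sentence_length": len(tokens) / len(sentences),
        "type_token_ratio": len(set(tokens)) / len(tokens),
    }

print(stylistic_profile("The cat sat. The cat ran! It ran far."))
```

Run over hundreds of texts, profiles like this would supply the normative background against which an individual work's stylistic deviations could be measured.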

For the French Structuralists the notion of structure has normally centered on linguistic structure, but there are a few notable exceptions. For Barthes, structure means primarily "patterns of recurrence and association" 21, distinguishing the ordering of units in a text from that created by pure chance. The best known application of Barthes' concept of structure is his S/Z 22. Here he divides a short story by Balzac, Sarrasine, into 561 'lexies' along the horizontal axis. Each lexie is the smallest portion of text that in Barthes' opinion carries 'meaning'. He then factors this meaning into five vertical 'codes'; unfortunately, the codes are highly idiosyncratic and completely subjective. Barthes makes no attempt at universality, or even possible wider applicability, and so this remains a necessarily isolated study.

The Formalist model of levels within language, and of the relations between these levels, was originally conceived by Firth and extended by M.A.K. Halliday. It has been applied strongly by Halliday to the literary and stylistic analysis of limited (though still sizable) samples of Golding's The Inheritors 23. Here he establishes syntactic collocations and shows that these constitute the growing awareness of the central character.

A central aspect of all of these schools' concept of structure is the notion of a horizontal material text, over which various abstract strata are projected. To describe relations along and across these strata the critics have relied primarily on the concepts of the paradigm and the transformation. The computer can accommodate both the stratified perspectives of these groups, and the relational models they have employed. It can also extend their work by providing the ability to add new strata or concepts as required, and to apply it all to the whole of large texts quickly and easily. This ability to handle large amounts of text quickly offers the possibility of verifying structuralist theories using a realistic corpus. The computer's very generality enables it to accommodate a variety of points of view, rather than being bound to any one school.

For all these reasons the computer would make a useful and compatible methodological adjunct to conventional Structuralist/Formalist perspectives on literature, providing the potential for testing and validating many theories which at the moment remain simply that - theories.


©Andrew Treloar, 2017. W: http://andrew.treloar.net/ E: andrew.treloar@gmail.com

Last modified: Monday, 11-Dec-2017 14:42:15 AEDT