Sunday, April 5, 2015

Could the Failure of 19th and 20th cent. TC be because of Taxonomic Approaches?

A remarkable post by Kamphuis reads, in part:

"...
Jan suggested to work with two dimensions in the classification: problems and causes. I immediately knew that was it! But how does that work, a classification with more than one dimension? I started to study the theory of classification, and I realized I had always been restricting myself to a certain form of classification, namely a taxonomy.


In a taxonomy, an object can occupy only one place in a hierarchical system: classifying a dog in a taxonomy of animals means positioning it at one of the branches of a tree, by means of characterizing it according to certain variables which are considered in sequence. 
However, there is also a more complex form of classification: a typology. An example of a typology would be the characterization of a group of people according to their gender as well as to the colour of their hair. Each individual is not positioned within a hierarchical structure, as in a taxonomy, but characterised according to two variables that are considered in parallel, instead of in sequence. 

We needed a typology! The argumentation for each conjecture necessarily has two dimensions, the detection of a problem (in the transmitted text) and the suggestion of a cause of the supposed corruption (that is, a certain type of scribal error/change). ..."
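To make the contrast concrete, here is a minimal sketch (not from Kamphuis' post; the animal tree, the Conjecture class, and the example values are my own illustration) of the difference between a sequential, taxonomy-style classification and a parallel, two-dimensional typology like the problem/cause scheme just described:

    from dataclasses import dataclass

    # Taxonomy: each object occupies ONE place in a hierarchy; the variables
    # (vertebrate/invertebrate, then mammal/bird/insect) are considered in sequence.
    animal_taxonomy = {
        "vertebrate": {"mammal": ["dog", "cat"], "bird": ["sparrow"]},
        "invertebrate": {"insect": ["ant"]},
    }

    def taxonomy_path(tree, target, path=()):
        """Return the single branch of the tree that leads to `target`."""
        for key, value in tree.items():
            if isinstance(value, dict):
                found = taxonomy_path(value, target, path + (key,))
                if found:
                    return found
            elif target in value:
                return path + (key, target)
        return None

    # Typology: two variables considered in parallel; each object is simply
    # located by its coordinates on both dimensions at once.
    @dataclass
    class Conjecture:
        problem: str  # the difficulty detected in the transmitted text
        cause: str    # the kind of scribal error/change supposed to explain it

    examples = [
        Conjecture(problem="grammatical difficulty", cause="harmonization"),
        Conjecture(problem="contradiction in context", cause="accidental omission"),
    ]

    print(taxonomy_path(animal_taxonomy, "dog"))   # ('vertebrate', 'mammal', 'dog')
    for c in examples:
        print((c.problem, c.cause))                # two independent coordinates

The point of the sketch is only the shape of the two structures: the taxonomy forces a single sequential path, while the typology places each conjecture in a grid whose axes are independent.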

Kamphuis' discovery parallels several other problems in NT studies, both in the organization of data and in the display of data.

Consider first of all the problem of grouping manuscripts. Lately, researchers have been trying two-, three-, and even higher-dimensional methods for graphing and measuring the 'closeness' of one manuscript's text to another, looking for 'clusters' or groups backed by some substantial, objective measure.

Example: Willker's Principal Components Analysis
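(Willker's actual chart and data are not reproduced here. The sketch below, with invented agreement data and scikit-learn's PCA, only illustrates the general idea: each manuscript's profile of readings is projected onto two principal components so that 'clusters' of related witnesses become visible without forcing the data into a single hierarchical tree.)

    import numpy as np
    from sklearn.decomposition import PCA

    # Hypothetical data, invented for illustration: rows = manuscripts,
    # columns = variation units, entries = which reading each witness attests.
    manuscripts = ["P66", "01", "03", "05", "Byz-1", "Byz-2"]
    readings = np.array([
        [0, 1, 0, 1, 0, 0, 1, 0],
        [0, 1, 0, 1, 0, 1, 1, 0],
        [0, 1, 1, 1, 0, 1, 1, 0],
        [1, 0, 1, 0, 1, 0, 0, 1],
        [1, 0, 0, 0, 1, 0, 0, 1],
        [1, 0, 0, 0, 1, 0, 0, 1],
    ], dtype=float)

    # Project each witness's high-dimensional profile onto the two components
    # that explain the most variance between witnesses.
    coords = PCA(n_components=2).fit_transform(readings)

    for name, (x, y) in zip(manuscripts, coords):
        print(f"{name:6s} PC1={x:+.2f}  PC2={y:+.2f}")
    # Witnesses with similar profiles land near one another in the plot,
    # which is where the visual 'clusters' come from.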


Secondly, we often want to display relationships that are in fact quite complex, but are best comprehended in 2-dimensional or 3-dimensional charts, which must often either leave out 'dimensions' of a problem or else distort them.

Consider, for instance, a Synoptic relationship diagram such as this:


Already we can see that certain details are left out or simplified (e.g. "other sources").


Or again, consider our own experience in trying to give an informative chart of the transmission data for a mere 12 verses of gospel (John 8:1-11):



But I'd like to draw attention to the specific fact that almost all "Evolutionary thinking" in the 19th and 20th centuries was based on the "Taxonomy Paradigm", which, bluntly stated, means the 'experts' were committed to a form of "One-Dimensional", sequential thinking and viewpoint: it was the only 'science' they had available at the time.

Perhaps this fundamental commitment to contemporary "science" as they understood it forced them to abandon even 'common sense' in regard to the data on (accidental) omissions in ancient manuscripts, and to embrace the only 'scientific' methodologies available, namely taxonomy-style approaches.

Could this have contributed to the widespread and large-scale 'blindness' regarding the majority of homoioteleuton omissions in the most ancient Uncial manuscripts and texts, and the almost mechanical and irrationally stubborn embrace of the "Prefer the Shorter Reading" axiom?

Kamphuis also tells us of the experience of 20/20 hindsight we can all relate to:
"But again and again some conjecture popped up that posed a problem and called for an adjustment of categories or definitions. Interestingly, most of the time such adjustments made the classification more straightforward, often making me wonder why that didn't occur to me earlier. ..."
If early Textual Critics were given another chance at reconstructing the NT text, would they be able to adapt and embrace the more modern, multi-dimensional view of today, and reassess the crude and (in hindsight) misleading 'guidelines' of 19th-century Textual Criticism?

Would they (unlike their modern ideological successors) recant and embrace the common (traditional 'Majority') Koine text found in the bulk of manuscripts extant today, representing multitudinous lines of transmission?

Would they abandon the "shortest text" in favour of the most likely text?




Wednesday, February 25, 2015

Homoioteleuton in Enoch and Scholarly Use of Scribal Tendencies

 
 
It is both enlightening and remarkable,
that when scholars examine other texts,
which are not considered "Holy Scripture" or "Divinely Preserved",
such as the Book(s) of Enoch,
the exact same problems and habitual copyist errors occur,
and these are just as easily and confidently identified,
based on the same probabilities.


In other words, the common errors of OMISSION,
usually caused by simple fatigue, in which a copyist
loses his place, and more often than not skips a line unnoticed,
are just as frequent in these other documents.

Such consistent and 'reliable' copyist errors form the basis
of all textual reconstruction of non-Biblical, classical, and secular texts.
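
To make the mechanism concrete, here is a minimal sketch (the sample lines, the ending length, and the search window are invented for illustration; this is not Charles' or Knibb's procedure) of how one might flag candidate homoioteleuton 'triggers': nearby lines whose endings agree, inviting the copyist's eye to jump from the first ending to the second and silently drop everything in between.

    # Flag pairs of nearby lines whose endings agree -- the classic set-up
    # for a homoioteleuton omission. Sample text invented for illustration.
    SAMPLE_LINES = [
        "and he spoke to them saying",
        "all that the Lord commanded him",
        "and they did not listen to him",   # ends like the previous line
        "so he departed from that place",
    ]

    def homoioteleuton_candidates(lines, ending_len=3, window=3):
        """Yield (i, j) for lines within `window` of each other
        whose last `ending_len` characters are identical."""
        for i, a in enumerate(lines):
            for j in range(i + 1, min(i + 1 + window, len(lines))):
                if a.strip()[-ending_len:] == lines[j].strip()[-ending_len:]:
                    yield i, j

    for i, j in homoioteleuton_candidates(SAMPLE_LINES):
        print(f"lines {i} and {j} end alike ({SAMPLE_LINES[i][-3:]!r}): "
              f"an eye-skip from the first ending to the second "
              f"would silently drop line(s) {i + 1}-{j}.")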


So why didn't 19th century Textual Criticism apply this knowledge
in exactly the same way when reconstructing the New Testament text?


The answer, sadly, is not the 'special habits' of Christian copyists,
or unforeseen and mysterious processes from which classical texts escape,
nor can it be explained by 'deliberate tampering' or other conjectures.

The sad fact is, when all the errors of omission are taken together,
it's obvious that they comprise a large body of ACCIDENTAL variants,
and no 'systemic' trend or trait can be demonstrated.

Neither is there any 'systemic' bias or editing or other tampering involved.

While some passages containing important doctrines were sensitive to errors,
and this created suspicion among both copyists and 'Editors' like Jerome,
the fact remains that even with all the omissions and mistakes taken into the text,
it remains exactly the same group of documents it was before:
it teaches in the main the same doctrines, presents the same history,
makes the same miraculous claims, and inspires the same religion.

One cannot for instance say that the majority of errors were "Arian",
or "Sabellian" or "Gnostic" in slant, nor can anyone make a claim that
all the errors are 'Roman Catholic' or based on superstitious beliefs.

These textual variants remain random in their impact as a group,
and the most likely explanation for the entire group is simple accident,
for the most part errors by omission due to the eye skipping a line or
skipping over a similarly ending pair of words in a line of text.

Thus R.H. Charles, in his characterization of the surviving manuscripts
of the Book of Enoch, was able to categorize the variants mainly as
homoioteleuton-style omissions, and not as scribal creativity.

Similarly, Knibb many years later made the same insightful observations:

In describing the Akhmim Manuscript (Codex Panopolitanus) of Enoch,
he states:

"Amongst the many mistakes in the manuscript particular attention - so far as this edition of Enoch is concerned, - should be drawn to the existence of numerous omissions, many through homoioteleuton..."
- Michael Knibb, The Ethiopic Book of Enoch, p. 17 (Oxford, 1978)



Contrary to claims of "scientific Textual Criticism",
the real reason that editors have unilaterally dismissed longer readings
in favour of shorter ones was not a knowledge of scribal habits,
but rather a prejudice against the text so great that it overwhelmed all
reasonable judgement in regard to the actual evidence.

The Critics and Editors were LOOKING for the shortest possible text,
to ELIMINATE any and all texts supporting miracles, incarnations,
and other plainly Christian doctrines which they had already regarded as
suspicious, superstition-based, and contrary to 19th-century materialism,
and which in their view encumbered the text with legends and mythology
accrued over centuries of collecting, through marginal comments
and imaginative conjectural emendation.

However, the evidence of the textual tradition and transmission itself
supports no such process. The stories of Jesus and the teachings of Paul
were, from the start, exactly as they are now.
The only accretions and 'editing' must have taken place within a few hundred years of Jesus' time.

These were in the main things like the gathering together of Paul's letters into
a single document, and the rewriting of the gospel of Mark to include more teachings
and sayings of Jesus.

Even the most extremely edited and shortened text
is still basically a New Testament of nominally Christian content,
complete with miracles; and in spite of the best efforts
of skeptical scholars, the New Testament remains a non-denominational Christian handbook.