Contents
- President's Welcome Address
- Local Host Welcome
- Conference Schedule
- Programme Outline
- Abstracts
President’s Welcome Message
It is my great pleasure to welcome you to Exeter for the 2007 meeting of the International Society for the History, Philosophy and Social Studies of Biology (ISHPSSB). We are indeed fortunate to be able to meet on the beautiful campus of the University of Exeter. The Local Arrangements Committee – John Dupré, Professor of Philosophy and Director of the ESRC Centre for Genomics in Society (Egenis) at Exeter, Keith Benson, Jane Calvert, Christine Hauskeller, Staffan Müller-Wille, Hans-Jörg Rheinberger, Ginny Russell and Cheryl Sutton – have provided a warm and accommodating venue for the meeting, including an enticing variety of activities in and around Exeter. I want to thank them very much for all of their very hard work over the past two years. I would also like to thank our Program Co-Chairs, Hans-Jörg Rheinberger and Staffan Müller-Wille, and the Program Committee (Carlos Sonnenschein, Anya Plutynski, Christine Hauskeller, Elihu Gerson, Ana Barahona Echeverria, Werner Callebaut, and Carl Craver), who have put together an exciting and wide-ranging program. This promises to be one of the largest and most successful ISHPSSB meetings we have ever had.
For this year's program, we have over 130 sessions, ranging over topics as diverse as "Human Automatism," "Rebels" in Science, and "Animal Minds", alongside some more familiar topics such as pre- and post-Darwinian evolutionary theory, classical and molecular genetics, and theory in biology. A survey of the sessions reveals several characteristics of our members' varying interests. It is interesting, for example, that there are only four sessions that deal directly with Darwin and his era, yet a total of 16 focus on twentieth-century evolutionary theory. Other areas that are getting considerable attention include "Evo-devo" (Evolution and Development, 10 sessions), history and philosophy of the neurosciences (4 sessions), ecology and environment (5 sessions), the relationship between mechanism and holism as philosophies of biology (6 sessions), and theory/ideology in biology, including "complexity" (8 sessions). There also seems to be a good mix of the three disciplinary areas that represent the Society: history, philosophy and social science. I am extraordinarily pleased to see that ISHPSSB's commitment to diversity of ideas is so alive. Still, we all face the "child in the candy store" dilemma of having to be selective at every time slot. We need to avail ourselves of the services of time-and-space theorists who can figure out a way for each of us to be at multiple sessions simultaneously.
I want to call the attention of all ISHPSSB members attending the Exeter meeting to two special sessions. The first is the Thursday opening Welcome Plenary Session, devoted to "Race and Genomics: Old Wine in New Bottles?", organized by the Program Committee, with papers by Nadia Abu El-Haj, Renato Mazzolini, Jenny Reardon and Sahotra Sarkar. The second is a special session to discuss the future possibility of ISHPSSB taking up the publication of a journal – either hard copy, on-line or both. This session, which follows the Plenary on Thursday morning, has been organized by the Ad-hoc Committee on Publication. When the Society first contemplated a publication project in 2005 (in an original offer to take over the journal History and Philosophy of the Life Sciences, in partnership with the Stazione Zoologica in Naples), members expressed considerable interest in such an undertaking, so this session will provide an opportunity for free and open discussion of future prospects in the publishing realm.
Enjoy the exciting venues awaiting you at Exeter as we begin our 2007 meeting.
Garland E. Allen
President
Welcome from Local Host
On behalf of the local arrangements committee, let me extend a warm welcome to everyone attending the ISHPSSB 2007 summer meeting here in Exeter. We trust that you will enjoy a relaxed but stimulating meeting with plenty of opportunities for interaction, both intellectual and social. Mindful of the overriding importance of the latter, the local arrangements committee has made sure that the Holland Hall bar will stay open until 2 a.m. from Thursday through Saturday. Chris Young and Keith Benson have generously agreed to sponsor the drinks reception on the first night, which will be held at the impressive Imperial pub, just across the road from the campus. This should kick the meeting off in the style in which we hope to continue.
This year's meeting has attracted over 450 delegates, which promises many interesting sessions to choose from. We also hope that you will find time to join some of the trips that we have arranged—or that you will find some time to explore on your own the famous moors, dramatic coasts, or just classic English farmland with which Devon is so generously endowed. You will quickly discover why this is the part of England where, before cheap flights offered more reliable weather, everyone in the UK wanted to take their holidays.
We are looking forward to being your hosts and everyone from Egenis will be happy to help if you have any questions or problems. Just look out for the local host badges.
Finally, we would like to thank the sponsors who have contributed to this meeting, Wetherspoons (owners of the Imperial pub) and Exeter City Council. Their generous contributions, together with the wonderful social traditions of Ishkabibble, will combine to guarantee that the meeting is not only intellectually exciting, but also great fun.
John Dupré
Chair of Local Arrangements Committee
Professor of Philosophy of Science and Director of ESRC Centre for Genomics in Society (Egenis), University of Exeter
Timetable
Wednesday, July 25
14:30-19:00 Registration, Lower Exhibition Hall
19:00- Young Benson Welcome Drinks Reception, The Imperial
Thursday, July 26
09:00-09:30 Late registrations, Lower Exhibition Hall
09:30-11:00 Presidential Address, Great Hall
Welcome Plenary Session: Race and Genomics: Old Wine in New Bottles?, Great Hall
11:00-11:30
Coffee break, Lower Exhibition Hall
I. 11.30-13.00
I.1 Exploring Possible Publishing Venues for ISHPSSB (roundtable)
I.2 What Happened to Evolution After the Synthesis?
I.3 Emergence, Reduction and Biological Systems
I.4 Gender and Genetics
I.5 Race in Biomedicine and Anthropology I
I.6 Logic of Discovery and Functional Ascriptions in Neuroscience: Bringing History and Current Research Together
I.7 Human Automatism
I.9 Evolution and Communication
I.10 Intersections in the Evo-Devo Juncture I
I.11 Inside and Outside the Laboratory
13:00-14:30
Lunch break, Lower Exhibition Hall
Council Meeting I, Peter Chalk 1.4
II. 14.30-16.00
II.1 Rebels of Life: Iconoclastic Biologists of the 20th Century I
II.2 Multi-Level Selection and Major Transitions: Groups, Individuals, and the Units of Evolution I
II.3 Exploratory Experimentation in the Life Sciences
II.4 BioOntologies: A New Type of Theory in Biology?
II.5 Race in Biomedicine and Anthropology II
II.6 Disease: 17th to 19th Centuries
II.7 The Statistical Roots of Biology
II.8 Theory in Biology
II.9 Explaining Development
II.10 Intersections in the Evo-Devo Juncture II
II.11 Functions I
16:00-16:30 Coffee break, Lower Exhibition Hall
III. 16:30-18:00
Rooms: Newman B, Newman C, Newman D, Peter Chalk 1.1, 1.3, 1.5, 1.6, 2.6, 2.5, 2.3, 2.1
III.1 Rebels of Life: Iconoclastic Biologists of the 20th Century II
III.2 Multi-Level Selection and Major Transitions: Groups, Individuals, and the Units of Evolution II
III.3 Conceptual Development in Industrial Contexts: Breeding, Horticulture and Sericulture in the 19th and 20th Centuries
III.4 Changing Perceptions, Changing Cultures: Molecular Biology and the Rise of the Biotech Industry
III.5 Molecular Anthropology Perspectives from History, Philosophy, and Human Geography
III.6 Anagenetic Optimisation Versus Cladogenetic Differentiation
III.7 Varieties of Mechanism
III.8 Communication and Aesthetics
III.9 Development, Inheritance and Evolution
III.10 Botany Between Knowledge and Science: Images, Interspaces, Experiences and Gender
III.11 Functions II
Friday, July 27
IV. 09:30-11:00
IV.1 The Genomic Revolution Revisited I
IV.2 Critically Assessing The Changing Role of the Embryo I (author-meets-critics roundtable)
IV.3 Systems Biology: Computational Models Integrating Evolution, Function, and Design
IV.4 Darwinism in the 21st Century: Beyond the Modern Synthesis I
IV.5 Social, Historical, and Philosophical Perspectives on the Inflation of Gene-related Knowledge
IV.6 Global Food Security and Science Policy: A Canadian Perspective (roundtable)
IV.7 Biomedicine
IV.8 Exploring Unknown Worlds
IV.9 Ethics and Evolution I
IV.10 Selection I
IV.11 Episodes in the History of Speciational Mechanisms
11:00-11:30 Coffee break, Lower Exhibition Hall
V. 11:30-13:00
13:00-14:30
Lunch break, Lower Exhibition Hall
Graduate Student Member General Meeting, Newman B
Lunch roundtable: Genes-> Interpretation-> Impact on Science or Society – What have ISHPSSBers achieved by critiquing genetic analyses (of various kinds)?, Newman C, 13:15
VI. 14.30-16.00
VI.1 Who is Hijacking Systems Biology? The Problem of Multilevel Explanation in Systems Biology
VI.2 Carving Out Action Potentials: Electrophysiology and the Causal Structure of the Nervous System I
VI.3 Models and Experiments
VI.4 Ecology, Environment and Politics
VI.5 The Importance of Homology for Biology and Philosophy I
VI.6 Social and Technological Dimensions of Biology
VI.7 Metaphors in Biology
VI.8 18th- and 19th-Century Biology
VI.9 Teaching Biology
VI.10 Specimens and Nomenclature
VI.11 The Impact of Symbiosis and Symbiogenesis on the Study of Evolution and Its Historical and Philosophical Implications I
16:00-16:30 Coffee break, Lower Exhibition Hall
VII. 16:30-18:00
18:15-20:00
General Membership Business Meeting, Newman A
V.1 The Genomic Revolution Revisited II
V.2 Critically Assessing The Changing Role of the Embryo II (author-meets- critics roundtable)
V.3 Sociology of the Biology of the Social
V.4 Darwinism in the 21st Century: Beyond the Modern Synthesis II
V.5 Talking to Scientists: Interpreting Interdisciplinary Communication (roundtable)
V.6 The Alberch Variations: Pere Alberch and the Cradle of Today’s Evo-Devo
V.7 Darwinian Themes
V.8 Environment and Ecology
V.9 Ethics and Evolution II
V.10 Selection II
V.11 New Perspectives on Biological Systems
VII.1 Evolutionary Ethics in Evolution
VII.2 Carving Out Action Potentials: Electrophysiology and the Causal Structure of the Nervous System II
VII.3 Biota, Biology, Place, and Belonging
VII.4 Reducing Anti-reductionism: Perspectives on Holism in the Life Sciences
VII.5 The Importance of Homology for Biology and Philosophy II
VII.6 Biology and Society
VII.7 Animal Minds
VII.8 Palaeobiology and Phylogenetics
VII.9 19th Century Theories of Evolution
VII.10 History of Molecular Biology
VII.11 The Impact of Symbiosis and Symbiogenesis on the Study of Evolution and Its Historical and Philosophical Implications II
Saturday, July 28
VIII. 09.00-10.30
VIII.1 Cultural Inheritance and Niche Construction: Historical and Philosophical Perspectives I: Contexts
VIII.2 Mechanisms and Causation I: Molecular Machines, Mechanisms and Systems
VIII.3 What (If Anything) are the Meaning and Implications of Gene-P? (roundtable)
VIII.4 Sports, Freaks, Monsters and Mutants: Toward a History of Mutation I
VIII.5 Mechanism and its Discontents in 20th-Century Biology I: Germany
VIII.6 Medicine and Biology
VIII.7 History of Evolutionary Biology
VIII.8 Hybrids in Ecology: (Post-) Normal Science and the Interface of Interdisciplinary Practices
VIII.9 Evolution and Social Behaviour
VIII.10 The Science of Sex and Gender I
VIII.11 Microbial Ontology
10:30-11:00 Coffee break, Lower Exhibition Hall
IX. 11:00-12:30
IX.1 Cultural Inheritance and Niche Construction: Historical and Philosophical Perspectives II: Connections
12:30-14:00 Lunch break, Lower Exhibition Hall
Council Meeting II, Peter Chalk 1.4
X. 14.00-15:30
X.1 The Epistemology of Development, Evolution and Genetics I (Essays in Honour of Richard Burian)
X.2 Plastic Minds: A developmental Perspective on Animal Behaviour
X.3 Reproduction and Reproductive Technology
X.4 Information and Biological Development I
X.5 Complexity, Robustness, and Explanatory Power in Biological Modelling
X.6 Optimality Modelling and Evolutionary Explanation
X.7 History of Classical Genetics
X.8 Bioenergetics
X.9 Representations of Nature in Life Science Pedagogy
X.10 Canguilhem in Context
X.11 Property of Life: Representations and Reproductions of Life in (Intellectual Property) Law I
15:30-16:00 Coffee break, Lower Exhibition Hall
XI. 16:00-17:30
19:00-
Dinner, Holland Hall
XI.1 The Epistemology of Development, Evolution and Genetics II (Essays in Honour of Richard Burian)
XI.2 Philosophies of Biology: Naturalistic, Transcendental or Beyond? (Octavian Discussion)
XI.3 Ideology in Biology
XI.4 Information and Biological Development II
XI.5 Cultural Evolution
XI.6 Explanation by Analogy: Metaphors, Models, or Comparison of Natural Entities?
XI.7 Complexity, Systems, Teleology
XI.8 25 Years on from Marr’s Vision: Philosophical Perspectives on the Boundary Between Neurobiology and Cognitive Science
XI.9 New Methods for Teaching History and Philosophy of Biology: Museums, Field Stations to the Internet (roundtable)
XI.10 New Issues in Levels of Individuality and Units of Selection
XI.11 Property of Life: Representations and Reproductions of Life in (Intellectual Property) Law II
IX.2 Mechanisms and Causation II: Ecology and Evolution
IX.3 DST and the Unification of Biology
IX.4 Sports, Freaks, Monsters and Mutants: Toward a History of Mutation II
IX.5 Mechanism and its Discontents in 20th-Century Biology II: Britain
IX.6 Perspectives on the Biological Sciences in Nazi Germany and Beyond
IX.7 Genomics: New Paradigm?
IX.8 Developing Digital Databases and Collaborations (open forum)
IX.9 Unification, Autonomy, and the Future of Biology
IX.10 The Science of Sex and Gender II
IX.11 Making Microbes Visible to the Philosophy of Biology (roundtable)
Sunday, July 29
XII. 09.00-10.30
XII.1 Complexity Management in Biology: Philosophical and Sociological Views I
XII.2 Mechanisms, Functions, Organization, and Emergence: New Perspectives on Reductionism I
XII.3 Stasis and Change in Development and Evolution
XII.4 Essentialism and Classification I
XII.5 Metaphysics of Natural Selection, Random Drift, and Mutation: Probability, Causation, and Modality I
XII.6 History of Biomedicine
XII.7 Philosophies of Science in the 18th Century: Critical Reflections on Methodologies for the Biological Sciences
XII.8 Ecology I
XII.9 Mathematics in Biology
XII.10 Foundations for a Genuinely ‘Cognitive’ Biology
XII.11 Pragmatism and Evolutionary Biology
10:30-11:00 Coffee break, Lower Exhibition Hall
XIII. 11:00-12:30
XIII.1 Complexity Management in Biology: Philosophical and Sociological Views II
XIII.2 Mechanisms, Functions, Organization, and Emergence: New Perspectives on Reductionism II
XIII.4 Essentialism and Classification II
XIII.5 Metaphysics of Natural Selection, Random Drift, and Mutation: Probability, Causation, and Modality II
XIII.8 Ecology II
XIII.10 Evolutionary Epistemology
XIII.11 Premodern Biology
Conference Schedule
First authors are presenting authors. * session organizer
Wednesday July 25
14:30-19:00 Registration
20:00- Young Benson Welcome Drinks Reception
Thursday July 26
9:30-11:00
Thursday Plenary Session
Presidential Address
Race and Genomics: Old Wine in New Bottles? (Welcome Plenary)
George Ellison, Jean Gayon, Renato Mazzolini, Jenny Reardon
11:30-13:00
Thursday Session I
I.1 Exploring Possible Publishing Venues for ISHPSSB (roundtable)
Garland Allen, Keith Benson, Jason Byron, Carl Craver*, Lindley Darden, Jane Maienschein, Staffan Müller-Wille; Chair: Carl Craver*
I.2 What Happened to Evolution After the Synthesis?
Jeffrey Schwartz*: Was the modern synthesis really a synthesis? Massimo Pigliucci: The (almost) forgotten phenotype
Gerd Müller: Where EvoDevo goes beyond the Modern Synthesis Chair: Jeffrey Schwartz*
I.3 Emergence, Reduction and Biological Systems
Mark Bedau: Pluralism about emergence in biology
Athel Cornish-Bowden, Maria Luz Cardenas: Metabolic circularity as
a guiding vision for systems biology
Alexander Powell*: Emergence, causation and levels in biological systems Chair: John Dupré
I.4 Gender and Genetics
Marsha Richmond: Conflict, controversy, and gender in early genetics: Selected case studies
Helga Satzinger*: A Weimar mongrel: The debates in biology and art on gender, race, and genes
Isabel Delgado Echeverria: Nettie Maria Stevens and the controversy about biological sex determination
Chair: Pnina Abir-Am
I.5 Race in Biomedicine and Anthropology I
Francisco Vergara-Silva, Carlos López-Beltrán, Fabrizzio McManus: The Mexican Institute for Genomic Medicine (INMEGEN) and the invention of the Mexican 'Mestizo' genome
Ludovica Lorusso, Giovanni Boniolo: Clustering humans: Boundaries and properties
George Ellison, Simon Outram: The business of racial criticism in biomedical research
Snait B. Gissis: When is 'race' a race? The use of the race category in genetics, epidemiology and medicine in recent decades
Chair: Simon Outram
I.6 Logic of Discovery and Functional Ascriptions in Neuroscience: Bringing History and Current Research Together
Jean-Claude Dupont*: The history of integration: From Spencer
to Sherrington and later
Denis Forest*: Causal role theory of functions and theoretical changes in neuroscience
Chair: Denis Forest*; Commentator: Céline Cherici
I.7 Human Automatism
Christopher Smith*: Victorian physiology and human automatism Holly Andersen: Contemporary automatism: The reification of willing Samuel Thomsen: How emergence might overcome epiphenomenalism Chair: Christopher Smith*
I.9 Evolution and Communication
Paulo Abrantes: Mindreading and evolution
Kent Van Cleave: Transactional analogues: Non-semantic representation
in the mind and elsewhere
Simon Huttegger: The Evolution of simple communication systems
W. Brad Pitts, Gregory Morgan: Evolution without species: The case of mosaic bacteriophages
Chair: Paulo Abrantes
I.10 Intersections in the Evo-Devo Juncture I
Roberta L. Millstein*: The nearly neutral theory of Evo-Devo? Grant Yamashita: The germ is dead – long live the germ! Chair: Elihu M. Gerson*
I.11 Inside and Outside the Laboratory
Raf De Bont: Between the laboratory and the deep blue sea: The lab-field border in the marine stations of Naples and Wimereux
Akihisa Setoguchi: War and biology: The transformation of entomological research in Japan, 1918-1945
Edmund Ramsden: The rats of NIMH and the urban crisis
Chair: Raf De Bont
13:00-14:30 Lunch Break
Council Meeting I
14:30-16:00
Thursday Session II
II.1 Rebels of Life: Iconoclastic Biologists of the 20th Century I
Oren Harman*, Michael Dietrich*: On rebels, icons, and the value of dissent Garland Allen: Hans Driesch, rebel with two causes
Raphael Falk: Wilhelm Johansson: A rebel or a diehard?
Chair: Oren Harman*, Michael Dietrich*
II.2 Multi-Level Selection and Major Transitions: Groups, Individuals, and the Units of Evolution I
Richard E. Michod: Evolution of individuality during the transition from unicellular to multicellular life
Peter Godfrey-Smith: Darwinian populations and transitions in individuality Samir Okasha: Evolutionary transitions, levels of selection, and cross-level by-products
Chair: Andrew Hamilton*
II.3 Exploratory Experimentation in the Life Sciences
Dick Burian: Exploratory Experimentation in Recent Molecular Biology and Genomics Kevin Elliott*: Varieties of Exploratory Experimentation in Nanotoxicology
Maureen O'Malley: Metagenomics and the proteorhodopsin case: Exploratory experimentation and its transformative effects
Chair/Commentator: Ken Waters
II.4 BioOntologies: A New Type of Theory in Biology?
Michael Ashburner: The application of ontologies in the biomedical domain Sabina Leonelli*: Bio-ontologies: A new means of travel for biological facts Rachel Ankeny: What lies beyond Babel? Lessons from the Worm Project Chair: Manfred Laubichler; Commentator: Massimo Pigliucci
II.5 Race in Biomedicine and Anthropology II
Yulia Egorova: DNA Evidence? Genetic anthropology and history
Sophia Efstathiou: Found science: Founding 'race' in science
Kathryn Plaisance: Behavioral genetics and the shared/nonshared environment distinction: How (not) to interpret behavioral genetic research
Chair: Yulia Egorova
II.6 Disease: 17th to 19th Centuries
Samantha Muka: Syphilis, the Church, and the body: Disease, cause, and treatment in 17th century England
Marion Thomas: Science, Religion and Politics: The Construction of Expertise on Cattle Plague in Pre-revolutionary France
Antoine Ermakoff: Jacques Tenon, Felix Vicq d'Azyr, and the hospital reforms of the end of the 18th century in France
Sarah Fisk: 19th century studies of hysteria and the work of Jean-Martin Charcot Chair: Antoine Ermakoff
II.7 The Statistical Roots of Biology
Eileen Magnello: The role of evolutionary biology in the establishment of mathematical statistics
Nancy Hall*: Did Fisher’s voluntary workers at Rothamsted make a difference in the spread of statistical techniques in agriculture?
Chair: Nancy Hall*; Commentator: Gregory Radick
II.8 Theory in Biology
Michael Bradie: Popper’s dance with Darwin
Astrid Juette: A story about story telling
Julien Delord: Neutral theories and the unification of evolutionary biology
Donato Bergandi: Holism-reductionism debate in ecology, ethics and sustainable development Chair: Donato Bergandi
II.9 Explaining Development
Roger Sansom: Why gene regulation networks are the controllers of ontogeny Michael Wheeler: What codes for what in development?
Laura Vandenberg, Carlos Sonnenschein, Ana Soto: It’s not in your genes but the company you keep: Phenotype, a view from the bench
Jesse Hendrikse: Scientific pluralism and the evolutionary explanation of development Chair: Jesse Hendrikse
II.10 Intersections in the Evo-Devo Juncture II Peter Chalk 2.3
Elihu M. Gerson*: Varieties of intersection: Specialties and collaboration networks Jason Robert: Evolutionary developmental medicine
Chair : Roberta L. Millstein*
II.11 Functions I
Sören Häggqvist: The select few: Etiological functions and normativity
Françoise Longy: Function as an overarching concept
Gillian Barker: How systems fail: Function, malfunction and dysfunction
Chair: Gillian Barker
16:30-18:00
Thursday Session III
III.1 Rebels of Life: Iconoclastic Biologists of the 20th Century II
David Hull: Leon Croizat: A radical biogeographer
William Dritschilo: Dan Simberloff and methodological succession in ecology
Ute Deichmann: Different research practices in early molecular genetics: Oswald T. Avery’s and Max Delbrück’s revolutionary findings and early responses
Chair: Oren Harman*, Michael Dietrich*
III.2 Multi-Level Selection and Major Transitions: Groups,
Individuals, and the Units of Evolution II
Andrew Hamilton*: What makes a group an evolutionary unit? Reliability and the transition to sociality in hymenopterans
Alirio Rosales: Multilevel selection, evolutionary transitions, and adaptive complexity Chair: Grant Yamashita
III.3 Conceptual Development in Industrial Contexts: Breeding, Horticulture and Sericulture in the 19th and 20th Centuries
Philip J. Pauly: Mums as the Measure of Men: Global Plant Culture in the Nineteenth Century
Theodore Varno: Naturalizing Selection: Ronald A. Fisher and the Rothamsted Experimental Station, 1919-1933
Lisa Onaga: Silkworm Breeding and the Development of Genetics in Meiji Japan Chair: Jonathan Harwood; Commentator: Barbara Kimmelmann*
III.4 Changing Perceptions, Changing Cultures: Molecular Biology
and the Rise of the Biotech Industry
Pnina Abir-Am: DNA at 40: The impact of biotech on collective memory in molecular biology (1993, 1953)
Michael Bürgi: From organic chemistry to molecular biology: Practical, institutional and strategic shifts in drug development at Hoffmann-La Roche, 1960–1980
Thomas Wieland*: Coping with the ‘Hoechst shock’: Perceptions and cultures of molecular biology in Germany
Chair/Commentator : Helga Satzinger
III.5 Molecular Anthropology: Perspectives from History, Philosophy, and Human Geography
Marianne Sommer*: Natural genealogies and the objectivity of approaches, technologies and objects in molecular anthropology
Catherine Nash: Mapping global mobilities: Family connections and difference in the genographic project
Lisa Gannett: From flies to humans: The genetic basis of group identity
Chair : Marianne Sommer*; Commentator: Jeffrey Schwartz
III.6 Anagenetic Optimisation Versus Cladogenetic Differentiation
Michael Gudo*: The Frankfurt-theory of constructional morphology: An innovative but unknown approach for reconstructing anagenetic events and its actual importance for understanding chordate evolution
Tareq Syed: Anagenesis and cladogenesis in deuterostome evolution: Well-known molecular phylogenies and well-forgotten morphological models
Chair/Commentator : Mathias Gutmann
III.7 Varieties of Mechanism
Rasmus Winther: Mechanisms, history and parts in compositional biology
Bert Leuridan: Can mechanisms replace laws of nature?
Trevor Pearce: The scorpion’s sting: Functions, mechanisms, and biomechanical explanation
Christopher Eliot: Ecological mechanisms Chair: Christopher Eliot
III.8 Communication and Aesthetics
Barton Moffatt: Signaling Processes and Biological Function: An Account of Signal in Cellular Biology
Tobias Cheung: Anthropo-biology in the 1940s: Jakob von Uexküll, Norbert Wiener and Arnold Gehlen on the functional circle of inside-outside-relations
Naomi Dar: Are biological structures aesthetic? Chair: Tobias Cheung
III.9 Development, Inheritance and Evolution
Peter Gildenhuys: Inheritance in Griffiths and Gray’s Developmental Systems Theory Matteo Mameli: A general theory of inheritance and its implications
Beth Hannon: Fetal programming, predictive adaptive responses and gene-centric thinking
John Sarnecki: Developmental objections to evolutionary modularity Chair: Peter Gildenhuys
III.10 Botany Between Knowledge and Science: Images, Interspaces, Experiences and Gender
Nicolas Robin*: Discussing the “translation” of J. W. von Goethe’s knowledge of nature into scientific literature for women
Marianne Klemun*: Systematic botany in the romantic Vienna and “Voyages into the flower fields of life”
Alexandra Cook: Between Praxis and Episteme: The Herbarium as Boundary Object Dawn Sanders: Private letters, public discourse: The botanical correspondence of Mary Treat and Charles Darwin
Chair: Marianne Klemun*
III.11 Functions II
Robert Gadda: Developmental systems and etiological theories of teleology Predrag Sustar: Functions in the morphospace
Georg Toepfer: Systems of functions: Functional attribution and functional decomposition in biology
Chair: Predrag Sustar
Friday July 27
9:30-11:00
Friday Session IV
IV.1 The Genomic Revolution, Revisited I
Vincent Ramillon: The material economy of genomic research: Automation, work division, and productivity
Monika Piotrowska: What does it mean to be 75% pumpkin: The units of comparative genomics
Peter Taylor: When is genetic analysis useful and sustainable? Perspectives on some new and old debates about genes and environment
Jane Calvert, Joan Fujimura: Systems biology: The revolution after the revolution?
Chair: Edna Suarez*; Commentator: Bruno Strasser*
IV.2 Critically Assessing The Changing Role of the Embryo I (author-meets-critics roundtable)
Raphael Falk, James Griesemer, Jonathan Hodge, David Hull, Carlos Lopez Beltran, Francisco Vergara-Silva, Rasmus Winther*, Ron Amundson (author)
Chair: Rasmus Winther*
IV.3 Systems Biology: Computational Models Integrating Evolution,
Function, and Design
Philippe De Backer, Dirk Gevers, Kyung-Bum Lee, Toshishiro Aono, Chi-Te Liu,
Shino Suzuki, Tadahiro Suzuki, Takakazu Kaneko, Manabu Yamada, Satoshi Tabata, Doris M. Kupfer, Fares Z. Najar, Grahan B Wiley, Bruce Roe, Hiroshi Oyaizu,
Marcelle Holsters: Comparative and evolutionary genomics of Azorhizobium caulinodans as a case study for the workings of the post-genome era
Beckett Sterner*: Reconnecting evolutionary and descriptive biology: A network effect in systems biology
Joris Van Poucke, Philippe De Backer, Gertrudis Van de Vijver, Marcelle Holsters, Dani De Waele, Linda Van Speybroeck: Anti-reductionism and modelling in systems biology: Different perspectives
Chair: Gertrudis Van de Vijver; Commentator: Linda Van Speybroeck
IV.4 Darwinism in the 21st century: Beyond the Modern Synthesis I
Ehud Lamm: Evolution of networks and networks of evolution
Eva Jablonka*: The developmental aspect of heredity and evolution
Marcello Buiatti*: The “benevolent disorder” and recognition processes as conditions for the different adaptation strategies of prokaryotes, eukaryotes and humans Chair: Eva Jablonka*
IV.5 Social, Historical, and Philosophical Perspectives on the Inflation of Gene-related Knowledge
C. Kenneth Waters*: Getting real about genetics and genomics: An anti-realist perspective
Hans-Jörg Rheinberger: On the dynamics of laboratory research: Views on molecular genetics
Christine Hauskeller: The promises of genomics: Only society makes them reality! Chair: C. Kenneth Waters*
IV.6 Global Food Security and Science Policy: A Canadian Perspective (roundtable)
David Castle*, Keith Culver, James Tansey
Chair: David Castle*
IV.7 Biomedicine
Carlos Guevara-Casas: Methodological convergence of conceptual interpretations in medicine and taxonomy
Susan Rogers: Building bioinformatic knowledge: Interlinking social networks and producing a valid microarray experiment
Justin Biddle: Non-malfeasance and the privatization of biomedical research Chair: Justin Biddle
IV.8 Exploring Unknown Worlds
Carol Cleland: Could there be undetected alternative forms of microbial life on earth? Carlos Ochoa Olmos: Search for extraterrestrial intelligence: A Kuhnian approach
on the matter
Christophe Malaterre: Explaining the origins of life on earth: Three explanatory schemes and a set of limit conditions
Chair: Carol Cleland
11:30-13:00
IV.9 Ethics and Evolution I
Benjamin Lazier: Aristotle, again: Jürgen Habermas, Leon Kass and the ethical self-understanding of the species
Barry Barnes: Biological explanations of human actions and the institution of responsible action
Angela Breitenbach: Connections between purpose and value in nature Chair: Angela Breitenbach
IV.10 Selection I
Abhijeet Bardapurkar: What is “natural” in natural selection?
Tim Lewens: Forces and causes, probabilities and populations: Clarifying the metaphysics of selection
Bartlomiej Swiatczak: Natural selection and the problem of reduction in life sciences Chair: Abhijeet Bardapurkar
IV.11 Episodes in the History of Speciational Mechanisms
Fern Elsdon-Baker: Darwinism and the ever changing definitions of the ‘inheritance of acquired characteristics’
Andy Hammond*: JBS Haldane and speciation: Not a beanbag but a full bag
Joe Cain: An epistemic community glued together: Evolutionary studies in the 1930s
Chair: Andy Hammond*
Friday Session V
V.2 Critically Assessing The Changing Role of the Embryo II (author-meets-critics roundtable)
Raphael Falk, James Griesemer, Jonathan Hodge, David Hull, Carlos Lopez Beltran, Francisco Vergara-Silva, Rasmus Winther*, Ron Amundson (author)
Chair: Rasmus Winther*
V.4 Darwinism in the 21st century: Beyond the Modern Synthesis II
Ayelet Shavit: Location, location, location! Negotiating places and perspectives in a biodiversity database
Ohad Parnes: From transmission to plasticity: The changing concept of heredity since the middle of the twentieth century
Chair : Eva Jablonka*
V.1 The Genomic Revolution, Revisited II
Bruno J. Strasser*: Natural history in the genomic age? The making of GenBank, 1982-1987
Edna Suárez*, Víctor-Hugo Anaya: Evolutionary tools and comparative genomics: Continuity in the shadow
Christophe Bonneuil, Jean-Paul Gaudilliere: Navigating the post-Fordist DNA: Network, regulations and variability in genomics and society
Chair : Edna Suarez*
V.3 Sociology of the Biology of the Social
Sharyn Clough: Triangulation, social location and ophthalmology:
Do you see what I see?
Nicole Nelson: Politicizing methodology: Standardization debates in behavior genetics Gesa Lindemann: Neuronal expressivity: A new technology of innocence
Batya Zelinger: Sociobiology and evolutionary psychology in the service of “instrumental rationality”
Chair : Sharyn Clough
V.5 Talking to Scientists: Interpreting Interdisciplinary Communication (roundtable)
Soraya de Chadarevian, Nathaniel Comfort, Elihu Gerson, Gail Schmitt* Chair: Gail Schmitt*
V.6 The Alberch Variations: Pere Alberch and the Cradle
of Today’s Evo-Devo
Laura Nuño de la Rosa: A reconstruction of the conceptual phylogeny of Pere Alberch within the tree of EvoDevo
Miquel De Renzi: Alberchian variations on evolutionary palaeobiology Arantza Etxeberria: Developmental constraints and possible life
Diego Rasskin-Gutman*: Evo-Devo today
Chair : Diego Rasskin-Gutman*
V.7 Darwinian Themes
Sara Schwartz: The nature of competition and competition in nature
Shunkichi Matsumoto: Evaluating the debate on genic selectionism: Based on the heterozygote superiority case
Arno Wouters: What did Darwin do to teleology?
Chair : Sara Schwartz
V.8 Environment and Ecology
Stefan Linquist: But is it progress? On the alleged advances of conservation biology over ecology
Todd Grantham, Mark Bedau: Geographic range as a weakly emergent trait
Chair : Mark Bedau
V.9 Ethics and Evolution II
Chris Zarpentine: Evolutionary moral psychology and moral philosophy Tomislav Bracanovic: Altruism and morality: Is disentangling really necessary? Vasco Castela: Virtuous behaviour need not be an evolutionary stable strategy Sherrie Lyons: East meets West: Buddhism, neuroplasticity and mirror neurons. Revisiting evolutionary ethics
Chair : Barry Barnes
V.10 Selection II
Jessica Pfeifer: Selection vs. drift: Apportioning causal responsibility Patrick Bateson: Has natural selection outlived its usefulness? Chair: Patrick Bateson
V.11 New Perspectives on Biological Systems
Brian Goodwin: The language of living processes
Jonathan Davies*: Distributed and local causation in systems biology Ulrich Krohs: How systems biology makes sense of (gen)omics
Werner Callebaut: From systems biology to evo-devo and back
Chair : Lenny Moss
13:00-14:30 Lunch Break
Graduate Student Member General Meeting
13:15 Lunch roundtable: Genes->Interpretation->Impact on Science or Society: What have ISHPSSBers achieved by critiquing genetic analyses
(of various kinds)
Bruno J. Strasser, Edna Suárez, Víctor-Hugo Anaya, Christophe Bonneuil, Jean-Paul Gaudilliere, Vincent Ramillon, Monika Piotrowska, Peter Taylor*, Jane Calvert, Joan Fujimura
Chair: Peter Taylor*
14:30-16:00
Friday Session VI
VI.1 Who is Hijacking Systems Biology? The Problem of Multilevel Explanation in Systems Biology
Paul-Antoine Miquel: Is there nomological closure in explanations in biology? Giuseppe Longo: From the “DNA is a program”, a misleading model and metaphor in molecular biology, toward the role of randomness and extended criticality of living entities
Denis Noble: Middle-out hierarchical options in causation
Ana Soto, Carlos Sonnenschein*: Physicalism, diachronic emergence and downward causation in experimental biology
Chair : Carlos Sonnenschein*
VI.2 Carving Out Action Potentials: Electrophysiology and the Causal Structure of the Nervous System I
William Bechtel: Delineating the phenomenon for electrophysiology: Emil du Bois-Reymond and his students Ludimar Hermann and Julius Bernstein
Kenneth Schaffner: Theories, models, and equations in biology: The heuristic search for emergent simplifications in neurobiology
Don Goodman: Woodward’s modularity condition, the causal Markov condition, and the causal structure of the central nervous system
Chair : Marcel Weber*
VI.3 Models and Experiments
Brad Wilson: Bridging the gap between theory and experiment in ecology
Jessica Bolker: Models-of and models-for: Two modes of representation in biological research
Monica Maria Márquez-Sánchez: Idealization and model organisms
James Marcum: Horizons for scientific practice: Scientific discovery and progress Chair: Jessica Bolker
VI.4 Ecology, Environment and Politics
Ageliki Lefkaditou, Anastasia G. Stamou, Dimitrios Schizas, George P. Stamou:
Environmental information in a Greek forest reserve: Scientific rhetoric and images of nature
Carlo Marcello Almeyra: Public Participation in the Mutual Conformation of Science, Technology and Society as a Problem for Applied Ecology
Renard Sexton: Public Policy Implications of Environmental Mechanisms Chair: Peter Bowman
VI.5 The Importance of Homology for Biology and Philosophy I
Paul Griffiths: The phenomenon of homology
Ingo Brigandt: Typology now: Homology and developmental constraints explain evolvability
Alan Love: Functional homology and homology of function
Chair : Marc Ereshefsky*
VI.6 Social and Technological Dimensions of Biology
Victor Rodriguez, Koenraad Debackere: Material transfer agreements and policy implications: Strategies for research materials in biotechnology
Miguel Garcia-Sancho: Creating a genetic language: DNA sequencing and the first modern biological databases (1965-1985)
Richard Holdsworth: Different disciplines, different perspectives on the pertinence of genomics to ways of studying human behaviour: Lessons of interviews with researchers Chair: Miguel Garcia-Sancho
VI.11 The Impact of Symbiosis and Symbiogenesis on the Study of Evolution and Its Historical and Philosophical Implications I
Ulrich Kutschera: Endosymbiosis and cell evolution: The history of an idea
Francisco Carrapiço: From symbiosis to symbiome: An epistemological approach
Chair: Nathalie Gontier*
VI.7 Metaphors in Biology Peter Chalk 1.6
Andrew Reynolds: The perspective metaphor of metaphor
Brendon Larson: Towards an ethics of biological metaphor: The case of promotional metaphors Jennifer Runke*: Towards an adequate theory of metaphor in biology
Erica Torrens, Ana Barahona: Is the tree of life metaphor really necessary?
Chair : Jennifer Runke*
VI.8 18th- and 19th-Century Biology
Alexei Kouprianov: "If we only could combine Tournefort's drawings with Rivinus's definitions:" The positive program by Johann Georg Siegesbeck (1686-1755) for systematic botany
Céline Cherici: Félix Vicq d'Azyr's understanding of human cerebral structures and contribution to the field of brain anatomy in the late eighteenth century in France
Thomas Burnett: Extinction in German Natural History, 1790-1830
Helen Blackman: The Cambridge school of animal morphology, 1882-1910
Chair: Helen Blackman
VI.9 Teaching Biology
Thierry Artzner: Political science at the école Libre des Sciences Politiques (Paris), 1870-1890
Adam Shapiro: Textbook authors and textbook salesmen: contrasting communities of biology knowledge production
Hyung Wook Park: Cytology textbooks, multidisciplinarity, and the making of the new science of aging in the United States, 1924-1945
Neil Haave: Why teach history and philosophy of biology to biology majors? Chair: Thierry Artzner
VI.10 Specimens and Nomenclature
Taika Dahlbom: Specimens: Between nature and the zoological gaze Tomomi Kinukawa: Metamorphosis of the private time: Natural history as entrepreneurship in Early Modern Dutch Atlantic
Rebecca Ellis: DNA bar-coding: A mere tool or the potential to remake our relationship with life?
Yann Bertrand: On sameness and reference in biological nomenclature Chair: Rebecca Ellis
16:30-18:00
Friday Session VII
VII.1 Evolutionary Ethics in Evolution
Eric Charmetant*: Towards analogues of ordinary morality in apes
Christine Clavien*, Chloë Fitzgerald: The impossibility of evolutionary realism
Jérôme Ravat: Can naturalized ethics help us find moral truths?
Chair : Eric Charmetant*
VII.2 Carving Out Action Potentials: Electrophysiology and the Causal Structure of the Nervous System II
Marcel Weber*: Causes without mechanisms: The Hodgkin-Huxley model revisited Carl Craver: When mechanistic models explain: The Hodgkin and Huxley Model of the action potential
Daniel Sirtes: The nexus, mechanisms and mechanism families Chair: Marcel Weber*
VII.3 Biota, Biology, Place, and Belonging
Matthew Chew*: H.C. Watson and the civil claims of "British" plants
Jesse Gryn, Christopher Buddle, Charles Vincent: The blueberry maggot goes to Harvard: Guy Bush, Ernst Mayr, and the controversy over sympatric speciation
Nathan Robert Smith, Michael Trestman: Evaluating the risk posed by biological invasions
Mark Barrow: On the trail of the ivory-bill: Science and the struggle to save an endangered species
Chair : Matthew Chew*
VII.4 Reducing Anti-reductionism: Perspectives on Holism in the Life Sciences
Shane Glackin: Facts and values against the reduction of ethics to biology Riin Magnus: From classical holism to the biosemiotic turn, 1920-1940
Jamie Stark*: Beyond atomism and holism: The anti-reductionist community in the 20th century
Chair : Sabine Brauckmann
VII.5 The Importance of Homology for Biology and Philosophy II
Marc Ereshefsky: Behavioral homology and psychological categories Mohan Matthen: Cognitive kinds and homology
Chair : Marc Ereshefsky*; Commentator: Gerd Müller
VII.6 Biology and Society
Irama Núñez, Ana Barahona: Transgenic corn through the perspective of communication
Nico Luedtke, Hironori Matsuzaki: The emergence of animal law: On institutional conditions of research in life sciences in Germany, The USA and Japan
Glenn Sanford: Educating citizens: Scientific literacy and public policy
Cor van der Weele: Justifying the moral agenda on genomics
Chair : Ana Barahona
VII.7 Animal Minds
Hugo Viciana, Hugo Mercier: Convergent Minds? Examining some current assumptions in the study of comparative social cognition of apes, crows, dogs, children and other animals
Simon Fitzpatrick: Simplicity and methodology in animal psychology: A case study Gregory Radick: Vervetese and its Contexts
Chair: Ulrich Stegmann*
VII.8 Palaeobiology and Phylogenetics
Keynyn Brysse: From weird wonders to stem lineages: the second reclassification of the Burgess Shale fauna
Derek Turner: The trendiness of paleobiology
Joel Velasco: Prior probabilities in phylogenetic inference
Fabrizzio McManus: Rational disagreement in phylogenetics: Maximum parsimony or maximum likelihood?
Chair : Keynyn Brysse
VII.9 19th-Century Theories of Evolution
Albert Peacock: Vestiges of the Natural History of Creation in America: A quick response in the years 1844-1847
Juan Manuel Rodríguez Caso, Rosaura Ruiz Gutiérrez: Alfred R. Wallace and his vision of anthropology and evolution
Roberto de Andrade Martins, Juliana Ferreira: Alfred Russel Wallace’s claims regarding spiritualism
Chair : Roberto Martins
VII.10 History of Molecular Biology
Susie Fisher: Not beyond reasonable doubt: A re-examination of Howard Temin’s provirus hypothesis
Howard Chiang: Separating molecules, building biology: The evolution of electrophoretic instrumentation and the material epistemology of molecular biology, 1945-1965
Jérôme Pierrel: Sequencing RNA in the 1960s and 1970s: An “RNA world”? Norberto Serpente: The visualisation of the invisible in cell biology:
The use of models describing cell function as a consequence of the molecular revolution (1970-2000)
Chair: Howard Chiang
VII.11 The Impact of Symbiosis and Symbiogenesis on the Study of Evolution and Its Historical and Philosophical Implications II
Richard A. Watson: Compositional evolution and symbiosis
Nathalie Gontier*: Ontological and epistemological implications of symbiosis and symbiogenesis
Chair: Francisco Carrapiço
18:15-20:00 General Business Meeting
Saturday July 28
9:00-10:30
Saturday Session VIII
VIII.1 Cultural Inheritance and Niche Construction: Historical and Philosophical Perspectives I: Contexts
Maria Kronfeldner*: In the name of culture: The history and importance of cultural inheritance
Ben Jeffares: The archaeology of cultural inheritance in early Homo Kim Sterelny: Moral nativism: A sceptical response
Chair : Manfred Laubichler
VIII.2 Mechanisms and Causation I: Molecular Machines, Mechanisms and Systems
Michel Morange: The increasing place of macromolecular machines in the descriptions of molecular biologists: What role do they play in explanations?
Lindley Darden*: Mechanisms in biology in hierarchical context
Pierre-Alain Braillard: Systems biology and the mechanistic framework Chair: Robert Richardson
VIII.3 What (If Anything) are the Meaning and Implications of Gene-P? (roundtable)
Paul Griffiths, Lenny Moss*, Jonathan Kaplan, Ken Schaffner, Rob Wilson Chair: David Depew
VIII.4 Sports, Freaks, Monsters and Mutants: Toward a History
of Mutation I
Staffan Müller-Wille: Sub-specific variation in the nineteenth century
Salim Al-Gailani: Monstrosities and medical men: Obstetric encounters with teratology in Britain, 1850-1900
Igal Dotan*: Interrogation of a fly
Chair: Hans-Jörg Rheinberger
VIII.5 Mechanism and its Discontents in 20th-century Biology I: Germany
Christian Reiß: Julius Schaxel and the emergence of organicism in Germany, 1910-1933
Jason Byron*: Holistic medicine and the rise of sexology in the Weimar Republic
Kevin Amidon: "Do you know Rosenberg's address?" Adolf Meyer-Abich and biological holism work toward the Führer
Chair: Thomas Cunningham*
VIII.6 Medicine and Biology
Elizabeth Watkins: The medicalization of male menopause in America
Elselijn Kingma: Harmful environments: A problem for the bio-statistical theory of health
Havi Carel: Unifying phenomenological and biological descriptions of disease Chair: Havi Carel
VIII.7 History of Evolutionary Biology
Daniel Becquemont: Intimations of natural selection: Patrick Matthew and Charles Darwin’s notebooks
John van Wyhe: ‘Darwin’s delay’: Another historiographical myth?
Grant Goodrich: Experimentation and the development of Lloyd Morgan’s canon Richard G. Delisle: Unpacking the Evolutionary Synthesis: How can so many epistemological and metaphysical issues stand within such a compact explanatory structure?
Chair: Daniel Becquemont
VIII.8 Hybrids in Ecology: (Post-)Normal Science and the Interface
of Interdisciplinary Practices
Thomas Potthast*: Epistemic-moral hybrids: Discussing ethical normativity
in the context of environmental interdisciplinarities: A case study of the federal nature protection agencies in Germany 1906-2006
Patrick Blandin: Eco-anthropology: A fertile hybrid? An epistemological approach to an evolving transdisciplinary field
Claire Waterton: Experimenting with the archive: Performance and emergence in the making of databases of nature
Astrid Schwarz*: Hybrids in ecology: Putting things in place Chair: Thomas Potthast*
VIII.9 Evolution of Social Behaviour
Ryan Muldoon, Michael Weisberg: Correlating strategies with neighbours even when the goal is anti-correlation
Tomi Kokkonen: Evolution of social behaviour by group selection
Yuridtizi Pascacio-Montijo: Beauty, Mate Selection and Evolutionary Psychology: A Critical Review
Chair: Tomi Kokkonen
VIII.10 The Science of Sex and Gender I
Sarah Richardson: Are men and women as different as humans and chimpanzees? Quantifying sex differences in the human genome
Sylvia Rolloff*: Explanatory models in behavioural endocrinology: Unifying
the mechanisms
Elisabeth Lloyd: Bias in evolutionary explanations of women’s orgasm Chair: Sylvia Rolloff*
VIII.11 Microbial Ontology
George Levit: Global microbiology: One more step to a "New Synthesis"?
Mathias Grote: Plasmids: Between autonomous molecules and symbiotic organisms
John Dupré*: Ontology from the microbe's point of view
Chair: Carol Cleland
11:00-12:30
Saturday Session IX
IX.1 Cultural Inheritance and Niche Construction: Historical and Philosophical Perspectives II: Connections
John Odling-Smee: Niche inheritance: Its implications for human cultural inheritance Kenneth Reisman: The role of the environment in human cultural inheritance
Lauren McCall: Isolation vs. diffusion: A cross-cultural test
Chair : Manfred Laubichler
IX.2 Mechanisms and Causation II: Ecology and Evolution
Viorel Pâslaru: The new mechanistic philosophy and the mechanism of competition
Stuart Glennan: Causal productivity, causal relevance and the nature of selection
Matthew Barker*: Mechanism range and natural selection
Chair : Matthew Barker*; Commentator: Lindley Darden
IX.3 DST and the Unification of Biology
Anouk Barberousse: Developmental Systems Theory and the Neo-Darwinian Theory of Evolution
Marie-Claude Lorne*: Positional information and parity thesis
Francesca Merlin: DST’s concept of expanded inheritance: Is it too expanded? Thomas Pradeu: DST and the definition of the organism
Chair: Marie-Claude Lorne*; Commentator: Philippe Huneman*
IX.4 Sports, Freaks, Monsters and Mutants: Toward a History of Mutation II
Luis Campos*: “Complex recombinations”: Rethinking the death of de Vries’ mutation theory
Christina Brandt: Victor Jollos' research on Paramecium: Pure lines and the concept of Dauermodification
Chair : Igal Dotan; Commentator: Hans-Jörg Rheinberger
IX.5 Mechanism and its Discontents in 20th-century Biology II:
Britain
Thomas Cunningham*: A reply to naïve mechanicism: J. S. Haldane’s shift from vitalism to holism and its effects on his philosophy of biology
Rony Armon: Mechanism without reductionism: colloid chemistry and the mechanist conception of life
Chair : Garland Allen
IX.6 Perspectives on the Biological Sciences in Nazi Germany
and Beyond
Volker Roelcke: Population genetics and psychiatry in the 1930s: British scientists and their views of the Munich school of psychiatric genetics
Alexander v. Schwerin: Isotopes and animal models in biological research during National Socialism
Bernd Gausemeier: Leviathan and the ultracentrifuge: Politics, technology and the life sciences in National Socialist Germany
Chair : Carola Sachse; Commentator: Sheila Weiss*
IX.7 Genomics: A New Paradigm?
Jon Umerez: Waddington’s symposia: A retrospective assessment
Laurence Perbal: The postgenomic era and a new systemic paradigm in biology? Thomas Reydon: How scientists use kinds: Genes and modules as a case study Chair: Laurence Perbal
IX.8 Developing Digital Databases and Collaborations (open forum)
Chair: Jane Maienschein
IX.9 Unification, Autonomy, and the Future of Biology
Marion Blute: What are the prospects for a biological theory of everything?
Diedel Kornet: Biology and the other sciences; autonomy and cohesion
Richard Creath: Cassirer's history of hope
Constantinos Mekios: Implications of current applications of systems biology for the scientific autonomy of biology
Chair: Marion Blute
IX.10 The Science of Sex and Gender II
Rebecca M. Young: Symmetry failures in studies of hormonal organization of the human brain: A case study of a new tactic for critical science studies
Chair: Sylvia Rolloff*; Commentator: Sarah Richardson
IX.11 Making Microbes Visible to the Philosophy of Biology (roundtable)
John Dupré, Carol Cleland, Katie Kendig, Steve Hughes, Pamela Lyon, Maureen O'Malley*
Chair: Marc Ereshefsky
12:30-14:00 Lunch Break
Council Meeting II
14:00-15:30
Saturday Session X
X.1 The Epistemology of Development, Evolution and Genetics I (Essays in Honour of Richard Burian)
Jean Gayon: How far is history of science relevant for philosophy of science? The case of the gene
Robert Richardson*: Integration and disintegration in evolutionary biology Ron Amundson: Burian’s Paradox and the future of evo-devo
Chair : Hans-Jörg Rheinberger
X.2 Plastic Minds: A developmental perspective on animal behavior
Colin Allen: Developing theories of imitation
Karola Stotz*: The construction of a developmental niche: A means for phenotypic plasticity
Russell Gray: Development and cultural transmission of tool use in New Caledonian crows
Chair: Karola Stotz*; Commentator: Patrick Bateson
X.3 Reproduction and Reproductive Technology
María Jesús Santesmases: Reproduction and cell cultures: Human genetics and prenatal testing in the baby-boom era
Bettina Bock v. Wülfingen: Founding the new discipline reproductive genetics: The role of model, theory and language
Karin Lesnik-Oberstein: The child that is wanted: Kinship and the body of evidence Alicia Villela: Debates in reproductive technologies: Semen banks and artificial insemination in USA
Chair : Bettina Bock v. Wülfingen
X.4 Information and Biological Development I
Nick Shea*: Developmental Systems Theory as a claim about inherited information Ulrich Stegmann*: Against causal and informational parity
Arnon Levy: Biological information as an explanatory metaphor
Chair : Paul Griffiths
X.5 Complexity, Robustness, and Explanatory Power in Biological Modeling
Michael Weisberg: Simplicity and generality in biological modeling Jay Odenbaugh: Robustness, multiple models, and realism
John Matthewson: Modeling trade-offs and scientific explanation Chair/Commentator: William Wimsatt
X.6 Optimality Modeling and Evolutionary Explanation
Joan Roughgarden: Optimality: Restoring life to the living
Stephen Downes: Life history theory, optimality modeling and evolutionary explanation Patrick Forber*: What good are optimality models anyway?
Angela Potochnik: Optimality explanation as anti-reductionism
Chair : Patrick Forber*
X.7 History of Classical Genetics
Fabricio Gonzalez-Soriano: Preventive discourse on pathological heredity: Materialization of medical power in the mexican civil law, 1870 to 1930
Mary Sunderland: T.H. Morgan’s multiple agendas related to regeneration
Lilian Pereira Martins: Opponents can help: Sturtevant, Morgan and the building of the first chromosome maps
Nils Roll-Hansen: Wilhelm Johannsen’s concept of the genotype Chair: Lilian Pereira Martins
X.8 Bioenergetics
Rivers Singleton: Early Concepts of Bioenergetics: Herman Kalckar, Fritz Lipmann, and Severo Ochoa
Douglas Allchin: Socializing Epistemics: Resolving the Ox-Phos Debate Chair/Commentator: Athel Cornish-Bowden
X.9 Representations of Nature in Life Science Pedagogy
Jenny Beckman: Linnaean traditions: School botany and biological recording
Ruthanna Dyer*: Learning through glass: Henry Ward and the Blaschka glass
animals in north america
Ingrid Birker, Tania Aldred: Dawson teaching sheets: 19th century natural science on cotton Chair: Ruthanna Dyer*
X.10 Canguilhem in Context Peter Chalk 2.3
Thomas Ebke: Norm and limit: Between Helmuth Plessner and Georges Canguilhem
Henning Schmidgen: Living concepts? Georges Canguilhem and the history of biological concepts Didier Debaise: What is a philosophy of individuation? Simondon’s theory of the living
Ugo Balzaretti: Michel Foucault and Georges Canguilhem: Biopolitics and the attempt at a new biological foundation of the human sciences
Chair : Henning Schmidgen
X.11 Property of Life: Representations and Reproductions of Life in (Intellectual Property) Law I Peter Chalk 2.1
Alain Pottage*, Brad Sherman: Representation and invention: Animate embodiments
Hyo Yoon Kang*: Genes are patents, patents are genes: The rise and fall of a scientific metaphor in legal analogy
Chair : Staffan Müller-Wille
16:00-17:30
Saturday Session XI
XI.1 The Epistemology of Development, Evolution and Genetics II (Essays in Honour of Richard Burian) Newman B
Manfred Laubichler: Regulatory gene networks: Historical and epistemological reflections
Robert Brandon: Developmental constraints reconsidered in the light of the ZFEL
Chair : Hans-Jörg Rheinberger; Commentator: Richard Burian
XI.2 Philosophies of Biology: Naturalistic, Transcendental or Beyond? (Octavian Discussion) Newman C
Werner Callebaut, Gertrudis Van de Vijver, Linda Van Speybroeck*, Dani De Waele, Lenny Moss, Jonathan Kaplan, Andrew Hamilton
Chair: David Depew; Commentators: Thomas Reydon, Jason Byron, Michel Morange, Filip Kolen, Helena De Preester, Joris Van Poucke
XI.3 Ideology in Biology Newman D
Nick Hopwood: ‘Skandalon’: Haeckel’s pictures of embryos in the struggle of world views
Eric Martin: Primordial Soup and the spice of life: J.B.S. Haldane between holism and mechanism
Valery N. Soyfer: Stalin and fighters against cell theory
Robert Bud: A sword from the field of battle: The double helix and the secret of life in 1950s Britain
Chair : Robert Bud
XI.4 Information and Biological Development II Peter Chalk 1.1
Michael Trestman: The informational bee: The integrative role of a causal concept of information
Lindsay Craig: Information and DNA: How the unexplanatory metaphor explains
Chair/Commentator: Paul Griffiths
XI.5 Cultural Evolution Peter Chalk 1.3
Christopher DiTeresi: Making developmental biology second nature: Graduate courses as scaffolding for disciplinary inheritance
William Wimsatt: Modularity, memes, and scaffolding in cultural evolution
Emily Schultz: Balinese water temples revisited: Approaching Steven Lansing’s Balinese ethnography from the perspective of constructivist evolutionary anthropology
Chair: Christopher DiTeresi
XI.6 Explanation by Analogy: Metaphors, Models, or Comparison of Natural Entities? Peter Chalk 1.5
Michael Bölker*, Tareq Syed: Genes and information
Mathias Gutmann*: Is information a metaphor, an allegory or a model?
Winfried Peters, Suin Roberts, Bernd Buldt: Molecular machines: A metaphor in the making
Ben Rathgeber: Mirror neurons and intention-understanding
Chair: Mathias Gutmann*
XI.7 Complexity, Systems, Teleology Peter Chalk 1.6
Jean-Sébastien Bolduc: Evolution in light of Leibniz’s principle of the identity of indiscernibles
Hernán Pringe: Teleology and complementarity: Kant, Bohr, biology and atomic physics
Jason Zinser: Who’s afraid of irreducible complexity?
Chair: Jean-Sébastien Bolduc
XI.8 25 Years on from Marr’s Vision: Philosophical Perspectives on the Boundary Between Neurobiology and Cognitive Science Peter Chalk 2.6
Matti Sintonen*: Scientific discovery, understanding, and the modelling of neurocomputational mechanisms
Oron Shagrir: Marr’s computational theories revisited
Otto Lappi, Anna-Mari Rusanen: Marr’s computational level and mechanistic explanation: Extending the notion of mechanism
Chair: Matti Sintonen*
XI.9 New Methods for Teaching History and Philosophy of Biology: Field Stations to the Internet (roundtable) Peter Chalk 2.5
Mark Borrello* (Using field stations), Mary Sunderland, Henning Schmidgen (Using on-line resources), John M. Lynch (Blogging for teachers)
Chair: Mark Borrello*
XI.10 New Issues in Levels of Individuality and Units of Selection Peter Chalk 2.3
Frédéric Bouchard: What is a symbiotic superorganism and how do you measure its fitness?
Minus Van Baalen: How new units of selection may emerge in the course of evolution
Philippe Huneman*: Evolvability, transitions and the emergence of new individuals
Chair: Philippe Huneman*
XI.11 Property of Life: Representations and Reproductions of Life in (Intellectual Property) Law II Peter Chalk 2.1
Tina Piper: An ongoing dialogue: Understanding life sciences through the lens of patent law in the early twentieth century
John Emrich: The laws of life: The first patent of an engineered life form
Chair: Staffan Müller-Wille; Commentator: Jane Calvert
Sunday July 29
09:00-10:30
Sunday Session XII
XII.1 Complexity Management in Biology: Philosophical and Sociological Views I Newman B
Alfonso Arroyo Santos: Systems biology and the limits of human cognition
Melinda Fagan: Stems and standards: Social mechanisms for managing complexity in immunology
Vivette García*: Modularity thinking as a way of managing complexity in developmental biology
Chair: Vivette García*
XII.2 Mechanisms, Functions, Organization, and Emergence: New Perspectives on Reductionism I (roundtable) Newman C
Marcel Weber, Manfred Laubichler, Sahotra Sarkar, Jason Scott Robert, Colin Allen, Alan Love, Paul Griffiths, Karola Stotz*, William Bechtel, Bob Richardson,
William Wimsatt, Mark Bedau, Todd Grantham
Chair: Karola Stotz*
XII.3 Stasis and Change in Development and Evolution Newman D
Jonathan Kaplan*: Evolutionary stasis and developmental stability: Are they related?
James Maclaurin: Universal development
Brett Calcott: Two ways that modules enable evolvability
Chair : Jonathan Kaplan*
XII.4 Essentialism and Classification I Peter Chalk 1.1
John Wilkins*: The unseasonable lateness of Being-What-It-Is, or, the myth of biological essentialism
Gal Kober: Biology without species
Yuichi Amitani: The role of “good species” in the species problem
John Collier: Review of the cohesion concept of species
Chair: John Wilkins*
XII.5 Metaphysics of Natural Selection, Random Drift, and Mutation: Probability, Causation, and Modality I Peter Chalk 1.3
Luciana Garbayo: Time in biology: An analytic critique and a possible world semantics approach to the temporal structure of living being
Chris Jenson: The case for a frequentist interpretation of fitness
Marshall Abrams*: The unity of fitness
Chair: Marshall Abrams*
XII.6 History of Biomedicine Peter Chalk 1.5
Maria Strecht Almeida: The erythrocyte as a model, or the virtue of lacking DNA: An account on the material culture of cell aging and apoptosis studies
Joao Nunes: The making of a pathogen: The early biography of Helicobacter pylori
Maxi Stadler: Quantifying excitable tissues in the 1930s
Chair: Maria Strecht Almeida
XII.7 Philosophies of Science in the 18th Century: Critical Reflections on Methodologies for the Biological Sciences Peter Chalk 1.6
John Zammito: Kant and the challenges of naturalism
Marcel Quarfood: Kant’s shifting attitude towards Naturgeschichte and Girtanner’s synthesis
Joan Steigerwald*: Instrumental reasoning in the 18th century
Chair: Philippe Huneman
XII.8 Ecology I Peter Chalk 2.6
Yin Gao, William Herfel: Post-classical ecology: On the emerging dynamic perspective from self-organizing complex adaptive systems
Rachel Bryant: “Invasive” species and the diversity-stability hypothesis
Gregory Cooper: In search of community ecology
Chair: Rachel Bryant
XII.9 Mathematics in Biology Peter Chalk 2.5
Giovanni Boniolo: Mathematical models and biology: A philosophical analysis
Trin Turner, Tom Schenk, Jr.: Bayes is the New Black: Agent-Based Modeling and Bayesian Inferences in Biology
Jukka Tienari: An algebraic model for teaching theoretical biology
Aidan Lyon: Probability in Evolutionary Theory
Chair: Aidan Lyon
XII.10 Foundations For A Genuinely ‘Cognitive’ Biology Peter Chalk 2.3
Pamela Lyon*, Jon Opie: Prolegomena for a cognitive biology
Fred Keijzer: Animality: Where biological cognition might start
Daan Franken: What do nervous systems do?
Chair : Carl Craver
XII.11 Pragmatism and Evolutionary Biology Peter Chalk 2.1
David Depew: Dewey’s Darwinism and the Baldwin effect
Tibor Solymosi: From the principles of psychology to dynamic systems: The influence of Darwin on James, Dewey, and cognitive neurobiology
Judy Johns Schloegel: What does a pragmatist genetics look like? Herbert Spencer Jennings and the politics of evolution and heredity
Mark Tschaepe*: Gospel of Greed: Peirce’s misreading of Darwin
Chair: Mark Tschaepe*
11:00-12:30
Sunday Session XIII
XIII.1 Complexity Management in Biology: Philosophical and Sociological Views II Newman B
James Griesemer: What Simon should have said
Chair : Vivette García*; Commentator: Peter Taylor
XIII.2 Mechanisms, Functions, Organization, and Emergence: New Perspectives on Reductionism II (roundtable) Newman C
Marcel Weber, Manfred Laubichler, Sahotra Sarkar, Jason Scott Robert, Colin Allen, Alan Love, Paul Griffiths, Karola Stotz, William Bechtel, Bob Richardson, Mark Bedau, Todd Grantham
Chair: Karola Stotz*
XIII.4 Essentialism and Classification II Peter Chalk 1.1
Mathias Brochhausen, Ulf Schwarz: Species essentialism without attributes: Processes, patterns and biological ontologies
Katie Kendig: The ontology of race
Arun Saldanha: Thinking populations through Deleuze
Charissa Varma: Axiomatizing the tree of life: The impact of logic on biological taxonomy in the early 20th century
Chair : John Wilkins*
XIII.5 Metaphysics of Natural Selection, Random Drift, and Mutation: Probability, Causation, and Modality II Peter Chalk 1.3
Denis Walsh: Fitness, discreteness and compositionality
Matthew Dunn: Two requirements for the concept of genetic drift
Chair: Marshall Abrams*
XIII.8 Ecology II Peter Chalk 2.6
Eric Desjardins: History Dependence in Ecology
Toben Lafrancois: Taxonomic resolution in ecology: How species concepts produce a plurality of ecological models
Clement Loo: What is natural?
Chair : Eric Desjardins
XIII.10 Evolutionary Epistemology Peter Chalk 2.3
Davide Vecchi: Two challenges for evolutionary epistemologies based on selection theory
Paola Hernandez Chavez: Reductionism in some naturalized epistemologies, or why philosophy matters
Heather Perez: Kornblith on knowledge: Reliability, then or now?
Chair: Paola Hernandez Chavez
XIII.11 Premodern Biology Peter Chalk 1.7
Jamie Feldman: The primacy of the heart in Aristotle’s biology and psychology
Sylvène Renoud: The relationships between text and images in microscopy of insects in the 17th century: The example of Swammerdam
Maria Elice Brzezinski Prestes: The emergence of themes of research in the epistolary relation between Lazzaro Spallanzani and Charles Bonnet
Stéphane Tirard: Spontaneous generations, beginning of life and history of life in Lamarck’s theory
Chair : Stéphane Tirard
Abstracts
Session: III.4 Room: Peter Chalk 1.1 Time: Thursday 16:30-18:00
Pnina Abir-Am
HBI-Brandeis University, Waltham, MA, United States
DNA at 40: The impact of biotech on collective memory in molecular biology (1993, 1953)
The 40th anniversary, in 1993, of the discovery of DNA structure was marked by half a dozen conferences held in the US and Europe. Despite their relatively large scale, those forums remained limited to an elite of academic, industrial, and governmental scientist leaders. This paper will focus on comparing the role of industrial and governmental biotech scientists at an international conference held in Paris under the sponsorship of UNESCO, and a conference held in Chicago under the sponsorship of local universities and their partner institutions elsewhere.
Both events produced large volumes including numerous memories, recollections and related contributions reflecting on the relationship between the discovery of DNA structure in 1953 and the scientific frontier in 1993. By then, the scientific frontier was marked by the intertwined fortunes of science with the biotech industry and large-scale governmental “big science” initiatives, such as the Human Genome Projects.
The paper explores how a then new commercial and industrial context of biotech produced a new range of commemorative sensibilities among molecular biologists. Not only was the scale of the commemorative occasions much larger than ever before, but the agenda shifted largely, if not entirely, to issues of intellectual property, and the delicate balancing of research, commercialization, and regulation.
Session: XII.5 Room: Peter Chalk 1.3 Time: Sunday 9:00-10:30
Marshall Abrams
University of Alabama at Birmingham, Birmingham, AL, United States
The Unity of Fitness
According to the original version of the propensity interpretation of fitness, fitness is a mathematical function of probabilities and numerical values associated with reproductive outcomes. In particular, the function was thought to be the expected or arithmetic mean number of offspring. In response to work by Gillespie in the 70's, some authors have argued that fitness might sometimes be defined in terms of geometric mean number of offspring, or a linear combination of the mean and variance of number of offspring, or some other function (Beatty & Finsen 1989, Brandon 1990, Sober 2001). While Brandon (1990) argued that fitness therefore merely satisfies a common schema instantiated by different mathematical functions, Ariew & Ernst (2007) have gone further, arguing that Gillespie's work shows that no coherent definition of fitness is possible.
Similar conclusions have been drawn from arguments that fitness must sometimes be characterized by an even wider variety of mathematical functions because of conspecifics' mutual influence on reproductive success (Ariew & Lewontin 2004, Krimbas 2004). For example, different functions might be needed to deal with sexual vs. asexual reproduction, frequency-dependent and density-dependent fitness, maternal effects, and some kinds of niche construction.
Despite the heterogeneity of mathematical functions needed to model fitness, I argue that fitness is nevertheless a common property of types in populations, and that: (1) It's plausible that fitness is constituted by one very complex, parameterized, mathematical function of probabilities, numbers of descendants, and other factors, of which different mathematical functions are specializations. (2) Whether or not (1) is correct, the fact that fitness involves different functions in different contexts is not in itself problematic, but is merely an extension of the common idea that fitness is determined by environment. (3) Though fitness must sometimes be defined in terms of probabilities of reproductive effects over several generations, this does not mean that fitness does not have to do with influence in each generation. Since probabilities of long-term effects can be derived from probabilities of short-term effects, the former are simply mathematical properties of causes acting in the short term. This removes a motivation for Brandon's schema account of fitness.
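As an editorial aside, not part of Abrams's abstract, the candidate fitness definitions this literature discusses can be displayed as follows, with X standing for a type's random number of offspring, N for population size, and T for the number of generations; the labels and the notation are introduced here only for illustration, and the approximations are the standard second-order ones.
\[
  w_{\mathrm{arith}} = \mathrm{E}[X], \qquad
  w_{\mathrm{within}} \approx \mathrm{E}[X] - \frac{\mathrm{Var}(X)}{N}, \qquad
  w_{\mathrm{geom}} = \Bigl(\prod_{t=1}^{T} X_t\Bigr)^{1/T} \approx \mathrm{E}[X] - \frac{\mathrm{Var}(X)}{2\,\mathrm{E}[X]}
\]
On this sketch, the "linear combination of the mean and variance" mentioned in the abstract corresponds to the second expression, and the geometric-mean proposal to the third.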
Session: I.9 Room: Peter Chalk 2.5 Time: Thursday 11:30-13:00
Paulo Abrantes
University of Brasília, Brasília, DF, Brazil
Mindreading and Evolution
It has often been pointed out that evolutionary biology is a field where a fruitful exchange between scientists and philosophers took place around several foundational problems. I want to focus here on the evolution of the human mind. Might a philosophical standpoint contribute to dealing with this topic? It became widely accepted that the social environment drove the evolution of distinctive cognitive skills in the hominid lineage. A central issue here is how to describe those skills. Philosophers are usually prone to giving a prominent role to common sense, in this case to folk psychological notions, in depicting the human mind. From this standpoint, the human mind is different from other primate minds not just in its representational capabilities but, moreover, in its metarepresentational skills, especially when using the latter for mindreading. Those skills would be at the basis of a cumulative culture and of other features that distinguish us even from our closest evolutionary kin. Despite the pioneering work of psychologists and primatologists on mindreading capabilities among primates, the questionable credentials of folk psychology, either for providing an accurate description of our mental states and processes or as an effective tool in behavioral prediction, might explain why folk notions don't play a prominent role (if any at all) in scientific accounts of human evolution, narrowly conceived. The situation is, actually, less clear-cut, even if we limit ourselves to philosophy. The status of folk psychology has been a hotly debated topic in the philosophy of psychology, and there are also various conflicting theories on how mindreading is accomplished. We have philosophers attempting to reduce folk psychological notions to well-behaved 'naturalistic' ones, and others that are skeptical towards the whole enterprise. Eliminativism also looms large in some quarters. Even among those that are committed to the irreducibility of folk psychological accounts, there are conflicting positions about their relationship to scientific accounts of the ontogeny and the phylogeny of the human mind. We can find those that argue for a full autonomy of a commonsense account vis-à-vis a scientific account, and those that look, instead, for an integration. Besides charting the territory, I would like to evaluate the prospects of integrative approaches in setting up evolutionary scenarios in which the architecture of the mind interacts with a widespread mindreading folk-practice, taken as a separate trait of the relevant selective environment. It is a further issue whether our interpretive capacities should be considered an adaptation and whether the complexity of a social environment is able to account for them.
Session: VIII.4 Room: Peter Chalk 1.1 Time: Saturday 9:00-10:30
Salim Al-Gailani
University of Cambridge, Cambridge, United Kingdom
Monstrosities and medical men: obstetric encounters with teratology in Britain, 1850-1900
In 1904, the Edinburgh obstetrician, John William Ballantyne (1861-1923), published his ‘Manual of Ante-natal Pathology and Hygiene: The Embryo’: the most substantial treatise on human teratology yet published in English. But why should an obstetrician consider teratology, a field to that date largely contained within the disciplines of comparative zoology and experimental embryology, as a legitimate area of clinical expertise?
This paper investigates the emergence of the embryo as a concern of British obstetrics and gynaecology before 1900, an area of inquiry hitherto largely neglected by historians of nineteenth-century medicine. Examining obstetric and midwifery manuals, advice books, as well as the medical press, proceedings of society meetings and lecture notes, I ask how British obstetricians came to value an understanding of laws of prenatal development.
Clinicians’ arguments for and against the theory of maternal impressions as a cause of fetal anomalies, which resurfaced in the second half of the nineteenth century, elucidate the awakening of the medical imagination to the laws of embryonic and fetal development, and heredity in this period. I use case reports discussing the validity of imaginationism to reconstruct dialogues between patients and practitioners concerning the stages in pregnancy, and assess how testimony from medical and lay sources was manipulated in the reporting procedure.
The appropriation of teratology and embryology will be considered in the context of the politics of obstetrics and gynaecology. In particular, I consider whether this knowledge was used as a means of claiming professional authority over other practitioners, especially midwives, as well as lay sources of advice to pregnant women during this formative period in British obstetrics.
Session: X.8 Room: Peter Chalk 2.6 Time: Saturday 14:00-15:30
Douglas Allchin
University of Minnesota, Minneapolis, MN, United States
Socializing Epistemics: Resolving the Ox-Phos Debate
Mitchell, Racker, Chance, and Slater; Boyer, Ernster, candlestick-maker?: so many biochemists at odds about energy in the cell. They eventually all accepted chemiosmotic theory, but all for different reasons. I will sketch the epistemic structure of the distributed, local responses and frame strategies for scientific practice.
Session: X.2 Room: Newman C Time: Saturday 14:00-15:30
Colin Allen
Indiana University, Bloomington, IN, United States
Developing Theories of Imitation
The neurobiological basis of behavioral imitation seems plausibly located in the so-called “mirror neurons” of F5 premotor cortex, first discovered in rhesus monkeys. Yet this raises a puzzle, because the general consensus among cognitive ethologists and comparative psychologists is that the capacity for imitation is not present in adult monkeys. A recent report of neonatal imitation in rhesus macaques, appearing only during a window of 3-10 days after birth, further extends the puzzle of why monkey mirror neurons don’t appear to support adult imitation. However, a report of enhanced imitative capacities in human-reared Japanese macaques provides a tantalizing hint of the role that developmental issues may play in experimental results. Debates about imitation in nonhuman animals have been characterized by arguments about definitions of imitation, but I will argue that a combined neural and developmental perspective will ultimately be more fruitful. I suggest that this lesson generalizes to many of the debates between cognitive ethologists and comparative psychologists, where developmental issues have largely been ignored.
Session: XII.6 Room: Peter Chalk 1.5 Time: Sunday 9:00-10:30
Maria Strecht Almeida
Abel Salazar Institute of Biomedical Sciences, University of Porto, Porto, Portugal
The Erythrocyte as a Model, or the Virtue of Lacking DNA: An Account on the Material Culture of Cell Aging and Apoptosis Studies
A few years ago (in 2001), an editorial in Cell Death and Differentiation drew readers’ attention to the fact that mammalian erythrocytes, which (as is well known) are cells devoid of nucleus, mitochondria, as well as other organelles, were “intriguing researchers interested in apoptosis”. Briefly, the assertion was that these cells, whose described features were first seen as indicative of their inability to undergo programmed cell death, and particularly apoptosis, emerged afterwards as a suitable experimental model for investigating that genetically regulated process (or at least some aspects of that process).
The first reports on the definite life span of human erythrocytes were published in the early twentieth century. Throughout the last century, the aging process of these cells mobilized several diverse groups of researchers, with the 1970s and 1980s being the period of greatest scientific output as measured in number of published papers. Many of these studies engaged with the idea of accumulation of injuries during the life span, or focused on the identification of a specific entity on the outer side of the cell that would be responsible for the removal of old erythrocytes from circulation. There seems to be general agreement that the simplicity of this cell, in conjunction with its oddness in lacking DNA, as well as the machinery of protein synthesis, was an advantage for the experimental model, providing a setting to study phenomena of (true) cell damage and aging given the scarcity of repair mechanisms. That consensus is easily found in the scientific literature. One should then take a closer look at how this “aging erythrocyte”, which shared several characteristics with the apoptotic cell but could not undergo apoptosis, entered the (late twentieth century) domain of programmed cell death research and has been portrayed as apoptotic. This turn seems worth exploring.
In this paper, we will examine both the material culture of the laboratory and the interpretations (and reinterpretations) of the experimental data transforming the epistemic framework in this area of biological/biomedical research; inevitably, the study of the biography of this experimental model entails a discussion of the renewed emphasis on systemic approaches in biology.
Session: II.1 Room: Newman B Time: Thursday 14:30-16:00
Garland Allen
Washington University, St. Louis, MO, United States
Hans Driesch, Rebel with Two Causes
Hans Driesch was a major figure in late 19th and early 20th century embryology and philosophy. Intellectually, he was a rebellious spirit who championed two unpopular causes during his career. In the 1890s he became a leading advocate of the new experimental embryology (Entwicklungsmechanik) pioneered by Wilhelm Roux, and for two decades vociferously espoused the cause of experimentalism and the mechanistic approach to biology. After 1910, however, he abandoned all experimental work and adopted an idealistic/metaphysical approach to biology in general and embryology in particular, codified in the notion of a vital force. How and why Driesch championed these two opposing positions will be the subject of the talk.
Session: VI.4 Room: Peter Chalk 1.1 Time: Friday 14:30-16:00
Carlo Marcello Almeyra
UNAM, Mexico City, Mexico
Public Participation in the Mutual Conformation of Science, Technology and Society as a Problem for Applied Ecology
In the search for strategies leading to the development of technology which can contribute to better societies, and in particular contribute to the development of more democratic decision frameworks, there are several issues one can consider. How can we create a broad consensus on the direction of scientific and technological (C&T) developments? How can we promote the expression of democratic values?
we characterize what C&T are in that context, how they are produced, what the consequences of such C&T changes are, with what intention C&T are produced, what type of social and economic development is expected, and, more generally, what the relations between science and society are. In this paper two aspects of this mutually conforming process involving C&T and society are emphasized: the historical dimensions of the public policies related to C&T (which are part of the relations between scientific practice and society) and how this is reflected in the formation of public policies. Our objective is, by means of an analysis of some examples, to suggest ways in which C&T can participate in the broadening of, and experimentation with, new frameworks leading to decisions involving (potential) developments or evaluations of C&T. Four case studies will be briefly considered. The first three cases provide a framework of possibilities which will allow us to discuss the fourth case in more depth. The first case concerns the proposal to raise genetically modified salmon in Norway. The second case is about ground contamination in Aspen, Colorado. The third case is about air pollution in Denver, Colorado. We want to suggest that in all of these cases, what we associate with successful participation is related to the development and deployment of specific institutional (or at least organizational) frameworks, as opposed to a mere consensus or balance of interests. The case analyzed in more depth is the project to enlarge the extension and the activities of a salt industry in Guerrero Negro, Baja California Sur. A comparison between the first three cases and the fourth one will allow us to draw conclusions about the importance of developing the above-mentioned institutional (or organizational) frameworks, which allow the situating of a decision process in the context of “converging communities”. The notion of “converging communities” and its significance for the issue at hand will be elaborated. It will be suggested that such a notion should be part of the development of any applied (social) ecology.
Session: VIII.5 Room: Peter Chalk 1.3 Time: Saturday 9:00-10:30
Kevin Amidon
Iowa State University, Ames, United States
“Do You Know Rosenberg’s Address?”: Adolf Meyer-Abich and Biological Holism Work Toward the Führer
The historiography of the position of the biological sciences in the period of transition to Nazi rule in Germany remains unstable. It sits uncomfortably between theoretical positions drawn from work on the physical sciences and the widely varied approaches taken to the relationships between the promulgation of race theories and the perpetration of the Holocaust. Exploration of Adolf Meyer-Abich’s practice as biologist and advocate of holism offers the possibility of a differentiated reading of the development of the use and misuse of ‘biological’ concepts and arguments in the transition to Nazi policy. Meyer-Abich’s numerous attempts before, during, and after the Nazi period to develop his holistic biological philosophy into an institutionalized element of academic, medical, and political action show us a scholar whose scientific practice was not primarily investigative, but rather persuasive. His theorization of biological holism as persuasive practice thus put him in a unique position to engage in ‘work toward the Führer’ after 1933. In collaboration with colleagues including Karl Kötschau, Pascual Jordan, and Joachim Mrugowsky, he navigated his holism (under further rubrics including ‘Neue deutsche Heilkunde,’ ‘biologische Medizin,’ and ‘ganzheitliche’ or ‘synthetische Weltanschauung’) between the favor and disfavor of diverse Nazi political and administrative authorities—and escaped the war with his person, his reputation, and his academic position intact. This paper will discuss these issues through focus on the most public of the controversies generated by Meyer-Abich’s holism during the first years of the Nazi regime: his failed appointment in 1936 to a professorship of biology in Jena, a controversy that he came to call the Jenaer Akademiestreit.
Session: XII.4 Room: Peter Chalk 1.1 Time: Sunday 9:00-10:30
Yuichi Amitani
University of British Columbia, Vancouver, BC, Canada
The role of “good species” in the species problem
Although the species problem has a long and winding history, it still has a couple of aspects which have attracted relatively little attention. One such aspect is the explanatory role of species. Most species concepts are supposed to play some role in biological explanations, and they also define species as populations (in a broad sense) with particular properties. However, when a species concept explains its explanandum (which may differ from one species concept to another), biologists give the explanation by citing particular properties of the populations, not by citing their specieshood. For example, the biological species concept (BSC) explains the coexistence of different species in one habitat, but its explanation works not by saying that reproductively isolated populations are species, but by citing properties of such populations. This suggests that the notion of species may be originally introduced not to explain something, but to be explained in biological explanations. What, then, is the notion of species as explanandum? One possible answer is discontinuity (whatever it means) in nature, which is in turn illustrated by prima facie examples of species (or “good” species), such as Drosophila melanogaster and Homo sapiens. Those good species are the objects of the explanatory projects which species concepts attempt to carry out, and they seem to place a conceptual constraint upon species concepts: for any species concept x, if x does not give specieshood to most good species, this gives a (prima facie) reason to reject x. In this presentation, I will show how the notion of good species figures in the species controversy and argue that it may offer a key to solving some puzzles in the species problem.
Session: X.1 Room: Newman B Time: Saturday 14:00-15:30
Ron Amundson
University of Hawaii, Hawaii, United States
Burian’s Paradox and the future of evo-devo
In studying the history of developmental biology and its interactions with evolutionary biology, I “rediscovered” a particular argument for the centrality of development to evolution that predated the birth of the Evolutionary Synthesis. Following Jan Sapp’s Beyond the Gene (1987) I came to call it the “developmental paradox.” I later learned that Dick Burian had also unearthed this argument, calling it “Lillie’s Paradox.” Burian described the argument and interpreted its significance in a 1986 lecture delivered at the University of Pennsylvania, which had lain unpublished (but not unread) until the 2005 anthology of his papers entitled The Epistemology of Development, Evolution, and Genetics. This is one of dozens of examples of Dick Burian out-researching his colleagues, but generously sharing his results before they were published. I made a very big thing of the developmental paradox in my 2005 book The Changing Role of the Embryo in Evolutionary Thought. In this paper I will review the history of the paradox, Burian’s contribution to its understanding, and argue that the paradox may give us a unique perspective on the future development of evo-devo.
Session: I.7 Room: Peter Chalk 1.6 Time: Thursday 11:30-13:00
Holly Andersen
University of Pittsburgh, Pittsburgh, Pennsylvania, United States
Contemporary Automatism: The Reification of ‘Willing’
Contemporary discussions of the role of consciousness in human agency are considerably influenced by the work of Benjamin Libet on the timing of conscious volition relative to the initiation of bodily movement. Other scientists, such as Patrick Haggard, have sought to extend Libet’s methodology to investigate the relative timing of further features of human will. Others have converted the scientific results into strong philosophical claims about the epiphenomenality of conscious agency: while he is not the only one to do so, Daniel Wegner and his Theory of Apparent Mental Causation is a prototype of this position. I present evidence that underlying these scientific and philosophical claims is a unified understanding of the object of investigation: namely, that individual acts of will, sometimes even called ‘willings’, are being timed and being shown to be inefficacious in initiating volitional movement. Reports by subjects of initiating movement are even labeled ‘W’ judgments. The authors treat these acts of willing almost like the mental equivalent of winking: short, individuatable acts which can be separated from other features of conscious experience, and that function like flexing some mental muscle in order to do something. These acts of will are reported by subjects to begin several hundred milliseconds after the ramping up of the readiness potential associated with movement. This understanding, I contend, is an inappropriate reification of the philosophical notion of will. Will is not a separate stage in the chain of events leading from mental intentions to physical movement, the causal role of which can be isolated and tested. Instead, will should be understood along the lines of Cartwright’s example of ‘work.’ Work is not something one does in addition to all the other details of the activities in which one is engaged; it is a name for all of those activities. Similarly, will is not something over and above the other features of agency at work when subjects listen to and agree to follow the instructions, and then implement them, in Libet-style experiments; will just is the name that collects those other activities together. Given this inappropriate reification, it is of little surprise that these supposed willings are found to have no influence on the initiation of movement.
Session: II.4 Room: Peter Chalk 1.1 Time: Thursday 14:30-16:00
Rachel Ankeny
University of Adelaide, Adelaide, Australia
What Lies Beyond Babel? Lessons from the Worm Project
This paper investigates the pre-history of bioontologies as developed in the case of the nematode Caenorhabditis elegans (C. elegans), culminating in the construction of Wormbase in 2000. I argue that the foundational assumptions underlying the research pursued and the communication mechanisms established within the community of researchers in the worm project parallel the philosophy behind bioontologies. However, it is less clear that the integration sought in this process has resulted in theoretical unification. Instead, I show how the ‘descriptive unification’ which has occurred thus far can still be viewed as having been fundamental to the unification of diverse fields of research within the biomedical sciences.
Session: IX.5 Room: Peter Chalk 1.3 Time: Saturday 11:00-12:30
Rony Armon
Bar-Ilan University, Tel Aviv, Israel
Mechanism without Reductionism: Colloid Chemistry and the Mechanist Conception of Life
Typical studies in molecular biology are test-tube reconstitutions of cellular functions. Only in the test tube can scientists isolate biochemical reactions and characterize their components. It is claimed that if we know the entire set of cellular components and their structures we can predict the function of the larger collection. Though common, this view has been challenged since the early history of biochemical research. This talk will discuss claims made in the 1920s by the developmental biochemist Joseph Needham. Needham borrowed conceptions from colloid chemistry to sustain his claims that the cell and the test tube represent different material realities. His biochemical model of embryonic development suggested a mechanist yet non-reductionist frame for the study of living systems. Needham’s conceptions have nearly vanished from the scientific discourse. However, claims concerning the limitation of the test tube appear to this day in leading biochemical journals. Like Needham in the 1920s, scientists today employ physical principles in their rejection of physical reduction in biology.
Session: XII.1 Room: Newman B Time: Sunday 9:00-10:30
Alfonso Arroyo Santos
Instituto de Investigaciones Filosóficas, UNAM, Mexico City, Mexico
Systems Biology and the Limits of Human Cognition
Recent years have witnessed the surge of systems biology and a series of new biomedical fields kept together under the label OMICS. The novelty of these fields lies in their ability to produce huge amounts of data and to present organic processes in their full complexity, and not in small pieces as has presumably been done until now. But before any of these disciplines is able to unravel the mysteries of biology, it will paradoxically first have to find ways to reduce such complexity: the sheer amount of data makes it difficult to really grasp the workings of a multivariable mechanism or to understand the actual logic behind a certain process. So the epistemological problem is: if the promise of these fields is to view the whole picture, how accurate will that picture be if the film has to be sliced into pieces in order to be understood? I will present some mathematical models used to solve this problem, but ultimately I will argue that the answer has nothing to do with computer power and number crunching but with developing new conceptual tools that will help us understand large pieces of the world and keep the original picture as authentic as possible. As illustration, I will compare the mathematical discussion to the “self” model as understood in contemporary immunology.
Session: VI.9 Room: Peter Chalk 2.5 Time: Friday 14:30-16:00
Thierry Artzner
University of Chicago, Chicago, Illinois, United States
Political science at the Ecole Libre des Sciences Politiques (Paris: 1870-1890)
The elaboration of a scientific knowledge of politics and the possibility of scientific politics were central concerns to the political thinkers of most denominations in the nineteenth century. The creation of the Ecole Libre des Sciences Politiques by liberals in January 1872 offers a distinctive example of the relation between science and politics in the first decades of the Third Republic. The
creators of the school claimed that they were studying political science as one studies zoology: “nous faisons des sciences politiques comme on fait des sciences zoologiques”. The aim of this paper is to explore the meaning and the implications of relating political science to the biological sciences.
Before the creation of this school, the aspirations to develop a science of politics were characterised by their diversity. However, in the first two or three decades of the Third Republic, French political science became inextricably linked (and still is) to a single Parisian institution: the Ecole Libre. Consequently, the meaning of political science was gradually reduced and determined by the scientific practice of this school. This paper therefore examines the practice of political science at the Ecole Libre: what form political science as a discipline took; how political science could be used to educate politicians; how it could be applied to contemporary politics?
In particular, I claim that the Ecole Libre shifted the epistemological foundations of political science: whilst political science had previously taken mathematics and physics as its model, the political scientists of the Ecole Libre took biology as theirs. This modification of the scientific paradigm of political science can be explained by the general evolution of science in the second half of the nineteenth century, in particular the popularization of biology (exemplified by Louis Pasteur) and of geology, and the emphasis on experimentation. But the focus of this paper is on the actual references and links to biology that the political scientists of the Ecole Libre tried to establish.
If the political scientists of the Ecole Libre thought that there could be a science of politics, it was because they argued that political science should be chiefly concerned with history. Yet the scientific character of history, in their eyes, was grounded upon the very peculiar idea that biology and geology (and sometimes chemistry) were nothing else than the study of a past, differing with history only in that the latter dealt with a more recent past.
This epistemological relation of political science to the biological sciences was, I argue, central to the school’s outlook on politics, especially the importance of grounding political action in a certain type of detailed knowledge of the past and, more importantly, in pragmatism. Thus, I argue that the school’s conception of political science led to the impoverishment of some fundamental concepts of liberalism and provided some of the intellectual bases of what we call technocracy.
Combining the history of political ideas with the history of scientific ideas therefore provides insight into the importance of some peculiar conceptions of history and biology in the intellectual elaboration of technocracy.
Session: II.4 Room: Peter Chalk 1.1 Time: Thursday 14:30-16:00
Michael Ashburner
1) University of Cambridge, Cambridge, United Kingdom, 2) European Bioinformatics Institute, Cambridge, United Kingdom
The application of ontologies in the biomedical domain
Approaching any biomedical scientist in, say 1997, to discuss ontologies would have drawn a blank, and puzzled, response. Today, such an approach would be far more fruitful. When a small group of us started the Gene Ontology in 1998 we were blissfully unaware of the intellectual history of this field, in either philosophy or artificial intelligence. Our effort was driven, not from any abstract desire to construct a theory of biological knowledge, but to satisfy the pragmatic needs of Model Organism Databases. Model Organism Databases are themselves a response to the growth in scientific knowledge, in particular to knowledge from genome and other “high throughput” projects. There are now databases for most of the major organisms used in biological research, from bacteria to mice (though not, for a variety of political reasons, for humans). The Gene Ontology, and many biomedical ontologies that it has inspired, were developed to provide common standards for these databases. In the particular case of the Gene Ontology these are standards to describe the “function” of gene products (proteins and non-translated RNAs). Our vision was to develop a standard that could be shared by all Model Organism Databases. The advantages of this effort were seen to be two-fold: if all databases shared the Gene Ontology then we would, de facto, achieve a degree of integration across these diverse resources; secondly, since each database needed to annotate its gene products with terms relevant to their “function”, by collaborating we could share the work in building a vocabulary of these terms. I will argue that this pragmatic approach to ontology development is the only one which should interest us. Ontologies are artificial constructs of little theoretical value, but immense practical advantage. One of the first ontologies developed in the biomedical domain had the term “being” as its root. Such philosophical purity has no place in practical ontology development. This is not to say that the ontologies we construct should not be rigorous; they must be, but only because they need to be manipulated by computer programs that are blind to the subtleties of the human mind.
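As an editorial illustration of the kind of shared annotation described in this abstract, and not part of the abstract itself, the sketch below shows gene products tagged with common Gene Ontology identifiers so that any database using the same identifiers can interpret the record; the three listed identifiers are the real GO root categories, while the example gene product and the helper function are hypothetical.
import sys

# The three real top-level Gene Ontology categories.
GO_TERMS = {
    "GO:0008150": "biological_process",
    "GO:0003674": "molecular_function",
    "GO:0005575": "cellular_component",
}

# A hypothetical annotation record: one gene product from one model organism
# database, tagged with shared GO identifiers rather than database-specific prose.
annotation = {
    "gene_product": "example-protein-1",  # hypothetical identifier
    "go_ids": ["GO:0008150", "GO:0003674"],
}

def describe(record):
    """Translate a record's GO identifiers into their shared term names."""
    return [GO_TERMS[go_id] for go_id in record["go_ids"]]

if __name__ == "__main__":
    # Because every database uses the same identifiers, the same lookup works
    # regardless of which organism's database produced the record.
    print(describe(annotation))
    sys.exit(0)
The point of the sketch is only that a shared controlled vocabulary, not any particular data structure, is what lets diverse databases be queried and integrated in the way the abstract describes.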
Session: X.10 Room: Peter Chalk 2.3 Time: Saturday 14:00-15:30
Ugo Balzaretti
University of Basel, Basel, Switzerland
Michel Foucault and Georges Canguilhem: Biopolitics and the attempt at a new biological foundation of the human sciences
The current significance of Georges Canguilhem’s (Castelnaudary 1904 - Paris 1995) historical epistemology of the biomedical and life sciences is not only attributable to “the centre stage in the scientific and social arena” that biological issues occupy today, as Paul Rabinow put it. That Canguilhem’s highly specialized approach could find, in the Paris of the sixties and seventies, the attention of such sociologists, psychoanalysts and philosophers as Robert Passeron and Pierre Bourdieu, Jacques Lacan, Louis Althusser, Gilles Deleuze and Jacques Derrida can only be explained by the far-reaching impact of the philosophical questions he deals with.
Behind his interest in specific issues of biological epistemology – e.g. the distinction between the normal and the pathological in relation to the epistemological status of the biomedical sciences, a neo-vitalist conception of knowledge as knowledge of life by life, the history of the concept of the reflex reaction – stands the broad project not just of a philosophy of biology but of a “biological philosophy”. This undertaking was an attempt to rethink and actualise Auguste Comte’s non-psychological foundation of the human sciences in a neo-vitalist way.
Canguilhem’s general philosophical project especially draws the attention of Michel Foucault, whose work is widely marked by a continuous confrontation with Canguilhem’s epistemology of the life sciences. Foucault sees in the latter the capacity to take up one of the most important events in the history of modern philosophy: the shift, with Kant’s Critique of Judgement and Hegel’s Phenomenology of Spirit, from the Cartesian question of the relations between subject and truth to the question of the relations between truth and life. Canguilhem’s non-psychological neo-vitalism is seen as answering the task of reformulating the “entire theory of the subject” that this shift involves.
My exposition will show in a first step that Canguilhem’s main undertaking does not imply the renunciation of subjectivity of the kind Foucault is proposing. Canguilhem’s Spinozistic philosophy of the concept does not, as Foucault claims, entail a radical critique of subjectivation. In a second step I will measure the consequences of this clarification for Foucault’s own understanding of the life sciences and of the relations between science and technique, knowledge and power, biology and politics. Attention will thereby be paid to the paradigmatic role of the life sciences in Foucault’s early archaeology and to his later theory of “biopower” and “biopolitics”.
Session: IX.3 Room: Newman D Time: Saturday 11:00-12:30
Anouk Barberousse
IHPST, Paris, France
Developmental Systems Theory and the Neo-Darwinian Theory of Evolution
Developmental Systems Theory or DST is often presented as a broad theoretical approach aiming at explaining developmental and evolutionary phenomena. One of its main goals is to extend the domain of the neo- Darwinian theory of evolution and include phenomena of extra-genetic heredity. DST advocates describe it sometimes as an actual scientific theory, sometimes as a broader enterprise, namely a kind of biological Weltanschauung which would be devoid of the conceptual problems they detect in today’s biology.
The aim of this talk is first to clarify DST’s main theoretical tenets, then to assess their interest and potential fruitfulness for empirical research programmes. In order to identify and clarify DST’s main claims as parts of a truly theoretical enterprise, it is first necessary to distinguish its positive claims from its mostly critical claims, directed against other conceptions of evolution, like Dawkins’ or Kitcher’s. This task is not an easy one, since DST’s motivation has been first to show that some parts of biology and psychology, like behavioral genetics, rest on unacceptable, or even incoherent foundations. DST proponents’ main target is the dichotomous conception of development in which a clear cut distinction is made between innate and acquired elements. I argue that these vigorous criticisms cannot be separated from DST’s positive tenets about development and evolution. I shall propose a presentation of DST whose aim is to be as coherent as possible in spite of its proponents’ divergences. I take it that such a reconstruction has to be based on an elaboration of two fundamental concepts, namely the concepts of evolutionary unit and of developmental unit. DST’s conception of heredity emerges from the articulation between these two renewed concepts.
I shall then examine the question whether an extended theory of evolution – which is a way of describing DST’s ambition – can do without the concepts of organism and of replicator. The concept of organism is replaced in DST by that of a developmental system or life cycle, and the concept of replicator is forcefully criticized, as is the distinction between replicator and interactor. In other versions of evolutionary theory, either the concept of replicator or that of organism is a fundamental concept. Therefore, the question ‘How does DST explain reproduction and evolution?’ is an important one. I shall claim that DST lacks an explanatory principle whose role would be similar to the transmission of genetic material or the transmission of genetic information in other versions of evolutionary theory. Reproduction is left partly unexplained, since it is only accounted for by a rather vague appeal to self-organization principles.
Session: II.11 Room: Peter Chalk 2.1 Time: Thursday 14:30-16:00
Abhijeet Bardapurkar
Homi Bhabha Centre for Science Education, Tata Institute of Fundamental Research, Mumbai, India
What is “natural” in natural selection?
Darwin naturalized the process of organic evolutionary change. How did his theory of natural selection achieve this naturalization? Here, I sketch my take on this question by contrasting natural selection with artificial (“man’s”) selection and asking: what is “natural” in natural selection? The theory naturalizes evolutionary change by first characterizing it as change by selection, in contrast to what I term change by transformative action, and then naturalizing the selection.
The theory of natural selection does not provide us with the cause of individual change, but tells us how existing individual changes are naturally preserved. The origin of individual variation is beyond the domain of Darwin’s theory. The theory could distance itself from the question of the origin – the origin of change and hence the origin of slight adaptation – because, one, it explains the evolution of existing individual changes; and two, it explains that evolution by selection. Darwin could naturalize organic evolutionary change because the explanatory structure of his theory is such that, one, the origin of change – its cause – is irrelevant for his theory, and two, a selector is not necessary for selection. The usefulness of an individual change to its bearer causally contributes to its own selection; no natural or artificial external agency is necessary; selection is a natural effect of inherited self-advantage. Evolutionary change by natural preservation and the consequent accumulation and augmentation of slight but adaptive individual changes is the distinctive mark of Darwin’s theory, as it would be of any theory that attempted to explain change by selection and accumulation, in contrast to change by transformative action. In natural selection, changes are not caused; they are just preserved.
Session: IX.2 Room: Newman C Time: Saturday 11:00-12:30
Gillian Barker
Bucknell University, Lewisburg, PA, United States
How Systems Fail: Function, Malfunction And Dysfunction
The self-regulating systems that support adaptive functions overlap the boundaries of biological objects such as cells, organs and individuals in a variety of ways, in forms that include extended phenotypes and constructed niches. Much of the biological world thus falls into the ‘contested zones’ that are subject to attempted regulation by more than one system. As a result, a self-reconstructing system can fail in two very different ways. The feedback web by which it maintains itself can erode or break down, resulting in malfunction. Or parts of this web may be co-opted by another system at the same or another level of organization, resulting in dysfunction. These different kinds of failure are usually not clearly distinguished conceptually, and they may be difficult to differentiate empirically. The differences between them are consequential, however, for our understanding of why self-regulating systems fail and for our efforts either to incapacitate such systems or to support them when they are at risk. The growth of knowledge in medicine has often resulted in new explanations that move familiar diseases from the category of malfunction to that of dysfunction. Similar considerations may have implications for our understanding of stability and instability in many other areas, including ecology and social organization.
Session: IV.10 Room: Peter Chalk 2.3 Time: Friday 9:30-11:00
Matthew Barker
University of Wisconsin-Madison, Madison, Wisconsin, United States
Mechanism Range and Natural Selection
Current analyses of ‘mechanism’ seem especially well suited for mechanistic explanations of phenomena at lower (e.g., molecular) levels of biological organization. But authors have suggested that these analyses do not suit putative higher-level mechanisms to which biologists appeal, such as natural selection. This is unfortunate because, notoriously, we are in search of greater conceptual clarity on higher-level biological phenomena. An account of ‘mechanism’ that suits higher levels would help advance us beyond certain conceptual issues, towards the relevant empirical ones.
To this end, the two chief innovations of this paper are an analysis of ‘mechanism’ that better suits higher-level phenomena, and a clarification of the mechanism of natural selection. The first innovation affords the second.
My analysis of ‘mechanism’ involves a novel interpretation of the Machamer, Darden and Craver account (MDC) of ‘mechanism’, which in turn allows me to introduce and develop the notion of mechanism range. This notion adds precision to the intuition that, relative to each other, mechanisms can be more or less exacting. MDC implies that a certain mechanism is the kind of mechanism it is (e.g., protein synthesis mechanism) because of its characteristic entities, activities and organization. I reinterpret this so that a mechanism is characterized by certain entity, activity and organization roles. Relative to other mechanisms, a given mechanism is then of wider or narrower range depending on the number of, and relative disparity between, the kinds of entities, activities and organization that can fill its characteristic roles. If relatively many different kinds of entities, activities and organization can satisfy a mechanism’s characteristic roles, and/or there is relative disparity among these different possible role fillers, the mechanism will be a relatively wide range mechanism. Wide range mechanisms are less exacting than narrow range mechanisms.
To show that we can understand at least some higher- level mechanisms as wide range mechanisms, I show how natural selection is a relatively wide range mechanism. Compared with lower-level mechanisms, relatively many and quite disparate kinds of entities, activities and organization can satisfy the characteristic roles of the mechanism of natural selection. Thinking of selection in terms of its characteristic roles affords conceptual clarity that can have empirical consequences: when appealing to selection, one attempts to empirically determine the role fillers in a mechanism of wide range.
Session: IV.9 Room: Peter Chalk 2.5 Time: Friday 9:30-11:00
Barry Barnes
Exeter University, Exeter, United Kingdom
Biological explanations of human actions and the institution of responsible action
In both biology and philosophy of biology, mechanism, atomism, determinism and reductionism have figured prominently among a cluster of explanatory perspectives the value of which is a matter of continuing controversy. Alternative approaches to explanation involving holism and emergence have never lacked defenders in this context, and indeed before the recent move to ‘molecularisation’ these approaches were often the predominant ones. Only in genetics, among the biological sciences, has explanation been consistently dominated by this cluster of notions, and in this anomalous context controversy about the explanatory merits of these notions has been especially intense and starkly expressed. Here too we see particularly clearly how the controversy bears upon our concern with ourselves, our nature as human beings and our place in the natural order. In human behavioural genetics in particular, the perceived conflict of determinism with free will and human autonomy, arguably the fundamental polarisation underpinning debates about mechanism and reductionism as well, continues not only to underpin the controversy but to be articulated explicitly as a part of it. Here one can find ideas of free will as causally undetermined and inexplicable being opposed by compatibilist positions wherein freely willed autonomous actions are simply those actions determined by the operation of an individual reason decoupled from external constraint.
The aim of this paper is first to present a quite different compatibilist position and secondly to spell out how that position relates to controversies about biological explanations and their implications. I suggest that the discourse of free will and everyday voluntarism is best understood as an element of an institution of responsible action, with essential functions within that institution in the ordering of our social life. Such discourse is not primarily concerned either to explain human behaviour or to set bounds on what can be explained. It is not concerned to identify autonomous actions or agents. On the contrary, it is a discourse that in itself has causal power, through the use of which we affect each other and modify each other's behaviour. It is a performative discourse, as the paper itself will explain. Its use presumes the vulnerability and susceptibility of those it addresses to the very discourse itself. Once this is recognised, the relationship between everyday voluntaristic discourse and the explanatory discourses of the biological sciences has to be understood in an entirely new way, again as the paper itself will explain.
Session: VII.3 Room: Newman D Time: Friday 16:30-18:00
Mark Barrow
Virginia Tech, Blacksburg, VA, United States
On the Trail of the Ivory-bill: Science and the Struggle to Save an Endangered Species
This paper explores the first systematic field study of the ivory-billed woodpecker, a large, charismatic bird that once inhabited forests in the river bottomlands of the southeastern United States. By the 1930s, the ivory-bill was one of several North American species facing extinction due to overhunting, habitat destruction, and competition from exotic species. In response, in the middle of that decade Aldo Leopold called for naturalists and wildlife officials to begin constructing a “Conservation Inventory of Threatened Species” and the National Audubon Society created a graduate fellowship program to learn more about the life history and ecology of those species. Audubon officials believed that the knowledge produced through this program would lead to policies ensuring the continued survival of native wildlife threatened with extinction.
The first Audubon-sponsored study examined the ivory-billed woodpecker. During the late 1930s and early 1940s, the Cornell doctoral student James Tanner completed both an extensive field survey of potential ivory-bill habitat (logging more than 45,000 miles in his Model A Ford) and an intensive life-history study of the only ivory-bills he found: a handful of surviving birds in Madison Parish, Louisiana. Based on his research findings about the habitat needs and behavior of the species, he also developed a plan to rescue it. Unfortunately, the 80,000-acre site where Tanner located the elusive species was privately owned, and the value of the timber on the land eventually thwarted all efforts to protect it. Within a few years after Tanner completed his study, the last stand of the ivory-bill was logged, dashing hopes for the species.
The story of Tanner and the ivory-bill provides a valuable window onto changes in the science related to endangered species in the 1930s and 1940s. Until this point, most research on vanishing wildlife had been piecemeal and haphazard—generally concerned with obtaining specimens and documenting the taxonomy, distribution, and population status of those species. For the first time, a new generation of Ph.D.-trained naturalists began to undertake detailed, systematic life-history and ecological studies of species threatened with extinction, based on long-term field work and with an eye toward shaping the policy needed to rescue them. What naturalists and conservationists quickly discovered, however, was that while these new forms of scientific knowledge might be necessary to save endangered species, they were not sufficient. Without a larger change in values, implementing the recommendations that Tanner and his colleagues made proved impossible.
Session: VI.2 Room: Newman C Time: Friday 14:30-16:00
William Bechtel
University of California, San Diego, La Jolla, CA, United States
Session: V.10 Room: Peter Chalk 2.3 Time: Friday 11:30-13:00
Delineating the Phenomenon for Electrophysiology: Emil du Bois-Reymond and his students Ludimar Hermann and Julius Bernstein
Often a substantial amount of pioneering work, both in developing instruments and techniques and in characterizing and evaluating the results of empirical investigations, is required just to delineate the phenomenon that will become the focus of a field of inquiry. I will focus on how Emil du Bois-Reymond and two of his students helped delineate the action potentials in nerves that became the focus of research in the field of electrophysiology.
The 18th-century investigations of Galvani and Volta using combinations of metals to elicit muscle contraction left unsettled whether nerves themselves conducted electricity. Using a new instrument, the galvanometer, which could detect weak electric currents, Carlo Matteucci detected electrical current in injured tissues. Moreover, he found that stacking such tissues had a multiplying effect similar to that achieved by adding bimetallic elements to a Voltaic pile. Building on these results, du Bois-Reymond returned to muscle and used a galvanometer to record a reduction (negative variation) in the resting current during the duration of a tetanus. He then sought to establish a similar reduction in nerves and to show that it was responsible for the muscle contraction. His student Ludimar Hermann challenged his claim of a resting current, maintaining it was an artefact of injury to nerve and muscle, and proposed a model in which the nerve functioned more like a capacitor than a battery. A chemical change would render the region of the nerve more electronegative, and the negative variation was accordingly a “self-propagating wave of negative charge which advances in steps along the tissue.”
After Helmholtz developed a technique for measuring the speed of transmission of excitatory potentials in nerves and showed it to be much slower than in metals, the question arose as to whether the excitatory potential measured by Helmholtz was the same as du Bois-Reymond’s negative variation. Another of du Bois-Reymond’s students, Julius Bernstein, created another new device, a differential rheotome, which enabled him to establish that the speed with which the negative variation moved down the nerve closely approximated Helmholtz’s speed for nerve transmission, and thus supported the claim that the negative potential constituted the means of electrical conduction. Du Bois-Reymond was a major advocate of a materialist conception of life, and much later in his career Bernstein helped surmount doubts that electrical transmission involved vital forces by articulating a theory according to which the negative variation was due to the permeability of the nerve membrane to potassium ions.
Patrick Bateson
Cambridge University, Cambridge, United Kingdom
Has natural selection outlived its usefulness?
Darwin’s phrase ‘natural selection’ was not liked by all of his otherwise sympathetic colleagues. Nevertheless, the agent implied by the metaphor may have been responsible for converting many members of the 19th-century public from natural theology to a more mechanistic view of evolutionary biology. Nowadays it is unlikely to have such benefits, while costs may flow from its inhibiting effects on fresh thinking in evolutionary biology. The metaphor implies passivity on the part of the organism. Yet niche construction, choice of sexual partners and prey, adaptability, and accommodation to new challenges all suggest that organisms play an active role in affecting what happens to their descendants. Furthermore, the blanket use of natural selection implies that the evolution of adaptations and the formation of new species involve the same processes.
Session: VIII.7 Room: Peter Chalk 1.6 Time: Sat 9:00-10:30
Session: X.9 Room: Peter Chalk 2.5 Time: Sat 14:00-15:30
Daniel Becquemont
University Lille 3, Lille, France
Intimations Of Natural Selection: Patrick Matthew and Charles Darwin’s Notebooks
In 1831, Patrick Matthew, in an appendix to his book Naval Timber and Arboriculture, mentioned that species were not produced by special creation, but could evolve under the action of a ‘force of change’. A law of adaptation to changing circumstances selected the individuals best suited to circumstances, with the help of a natural disposition to vary. There was a ‘selection by laws of nature’.
The resemblance between Matthew’s formula and Darwin’s theory in the Origin of Species is striking, and it would be easy, stressing such similarities, to consider Matthew’s considerations as anticipating Darwin’s views on natural selection and descent with modification – as Darwin himself actually did when Matthew called his attention to his own views. It would be just as easy, from the reverse point of view, stressing the differences, to conclude that Matthew’s views on selection did not aim at all to found a theory of evolution, but remained an ‘appendix’ to his considerations about arboriculture and breeding, the change of species being only a secondary consequence, supported by common sense, which did not need further demonstration. Matthew did not fully grasp the importance of his own views on ‘selection by law of nature’.
Though Darwin was explicitly in search of a theory of adaptation to circumstances and of laws of generation in his early notebooks, such considerations could apply as well to the few remarks they contain which hint at a possible selection in nature before his reading of Malthus in September 1838. Some passages could suggest that Darwin did think of a natural law of selection, yet he did not actually grasp at the time the full importance of his own stray remarks. The concept of a divine plan ruling the harmony of nature, which still persisted, implicitly or sometimes explicitly, in his search, prevented the full development of the concept of natural selection. His rejection of the traditional theory of Creation went together with a typically Victorian reverence for the Creator: a superior will was at work in nature, acting through the means of secondary causes, or scientific laws, leading organic beings towards a greater variety of more complex forms.
This paper will support Isidore Geoffroy-Saint-Hilaire’s considerations about the nature of scientific discovery: “Is it not fair to ascribe the glory of a discovery to the person who brought it to its final development and brought the proofs, to the person who, with so much genius, was useful, rather than to the author of a vague and often equivocal idea, in which we catch a glimpse of the germ of a discovery only because another person has fully developed it?”
Jenny Beckman
Uppsala University, Uppsala, Sweden
Linnaean Traditions: School Botany and Biological Recording
When the Swedish Museum of Natural History was founded in 1819, its directors appealed to “the ancient love of science” of the Swedish people, representing their country as the “native land of modern natural history”. More than a century later, plant geographer Eric Hultén claimed that love of botany was a “typically Nordic trait”, with reference to the “Linnaean heritage”. And as the tercentenary of Linnaeus’s birth draws near, “Linnaean traditions” turn up all over the place.
In a suspiciously elegant way, the establishment of a science curriculum in Swedish primary and secondary schools coincides with the first modern Linnaean celebrations in 1878. My aim in this paper is to examine school botany and the importance of collecting in the curriculum, and specifically the various handbooks, field guides and manuals developed for these purposes – which later developed into the adult education and study circle materials used in modern biological surveys. This tangible, often compulsory botanical practice is a way of tracing, and questioning, any alleged Linnaean traditions.
Session: I.3 Room: Newman D Time: Thursday 11:30-13:00
Mark Bedau
Reed College, Portland, Oregon, United States
Pluralism about emergence in biology
Emergence is an appealing and seemingly natural notion to apply to many biological phenomena, including those in molecular biology. But it is also confusing and controversial. The concept is mired in disputes about downward causation, subjectivity, and irreducibility, among other things, and there is a zoo of emergence concepts that have been defined and defended in the literature. Although there is a general assumption in this literature that one ought to develop and defend the one proper analysis of emergence, I will by contrast defend pluralism about emergence. This pluralism will be grounded in a general theory that any legitimate conception of emergence must fall under. I will show how a number of different conceptions of emergence are unified by this general theory, especially the three most popular approaches to emergence: what in other publications I have called “nominal” emergence, “weak” emergence, and “strong” emergence. I will then explain how nominal and especially weak emergence have a legitimate and interesting application to molecular biology. I will also show how these two conceptions of emergence defuse all serious worries about the metaphysical legitimacy and epistemological appropriateness of emergence.
Session: II.8 Room: Peter Chalk 2.6 Time: Thurs 14:30-16:00
Donato Bergandi
Session: VI.10 Room: Peter Chalk 2.3 Time: Friday 14:30-16:00
Muséum National d’Histoire Naturelle, Paris, France
Holism-reductionism debate in ecology, ethics and sustainable development
The holism-reductionism debate is a foundational issue, an epistemological “permanent gravity center”, around which all scientific disciplines revolve. In ecology, this debate has been very useful, pointing out the possible coherence between ontological assumptions specific to some ecological research paradigms (i.e. ecosystem ecology, landscape ecology, global ecology) and their methodological practices. In fact, it is becoming more and more evident that the proposal of a heuristic holistic worldview does not necessarily reflect real ecological research practices.
Particularly in ecology, the always-present risk is the “fallacy of misplaced concreteness” (Whitehead). What is, for example, the real level of organization under analysis in ecological research on ecosystems? Are ecosystems real entities characterized by specific emergent properties? Or are they simply useful heuristic hypostases? If they are nothing but fictions, reductionistic approaches are necessary and sufficient. Whereas, if they are real emergent entities, a concrete and productive holistic approach is an inescapable necessity.
Lately, new issues have emerged on the border between ecology and society, and the rise of new ethical questions appears to be a natural development. In this moving and fruitful scientific, epistemological and ethical domain, some foundational issues have become preeminent objects of study in the philosophy of ecology: among others, the conservation biology issue, the various scientific and ethical values of biodiversity, and the multifarious meanings and practices of sustainable development. In all these areas the holism-reductionism issue plays a central role because it helps to identify the research’s objects and methodological tools, and to define the most appropriate epistemological background. Questions like the following are the clear sign of fundamental and complex issues:
First, what does the holistic conservation biology approach to the preservation of species and ecological systems involve?
Second, what are the boundaries of the “ethical community”? Which option, individualistic or holistic, is the most adequate to safeguard the integrity of the planet’s biodiversity?
Finally, what does the “holistic” integration of socio-economic and natural processes affirmed in the sustainable development model mean and imply?
Yann Bertrand1,2
1) Södertörn University College, Huddinge, Sweden, 2) Muséum National d’Histoire Naturelle, Paris, France
On sameness and reference in biological nomenclature
A central claim in phylogenetic nomenclature is that taxon names should be associated with the same clade under different phylogenetic hypotheses. This is not a straightforward claim and it brings the issue of sameness to the fore. What does it mean to be the same clade in different phylogenetic hypotheses? How is taxonomic reference maintained across hypotheses? Here I discuss the differences between real and hypothetical clades and how such a distinction relates to the sameness problem. Although appealing, the notion of naming real clades is vague. Since hypotheses determine how we perceive things and pursue science, it is important to have a functioning nomenclatural system for hypotheses like clades. As a solution to the sameness problem for such clades, I argue that a phylogenetic name does not primarily refer to a single clade that somehow mirrors the reality of branches in the tree of life. Instead I suggest that a phylogenetic name refers to a natural kind of counterfactual and hypothetical clades.
Session: IV.7 Room: Peter Chalk 1.6 Time: Friday 9:30-11:00
Justin Biddle
Session: X.9 Room: Peter Chalk 2.5 Time: Saturday 14:00-15:30
University of Bielefeld, Bielefeld, Germany
Nonmaleficence And The Privatization Of Biomedical Research
Traditionally, the principle of nonmaleficence in biomedical ethics is understood in terms of the relationship between the physician, on the one hand, and the patient, on the other. The social context of the doctor-patient relationship is rarely emphasized. This relatively individualistic approach, I argue, is misguided. More specifically, I argue that the principle of nonmaleficence can only be followed adequately within certain institutional contexts. I illustrate this point through a discussion of the recent privatization of biomedical research. In the past 25 years, the organization of biomedical research has undergone rapid and profound changes, the result of which is that for-profit corporations have increasing control over the entire research process. Furthermore, it is becoming increasingly clear that these organizational changes are having detrimental effects upon the quality of research. As a result of these changes, epistemic rigor is increasingly being sacrificed at the altar of higher profits. The privatization of biomedical research has important implications for biomedical ethics in general and for the physician-patient relationship in particular. A significant effect of privatization is that doctors are increasingly reliant on industry-sponsored research for their evaluations of the risks and benefits of drugs. When this research is epistemically deficient – for example, when the benefits of drugs are exaggerated or the risks of new treatments understated – then the information that doctors receive is deficient. And although individual doctors could, in theory, investigate the effects of a drug on their own – and thus become less dependent upon pharmaceutical industry studies – they could, at best, only do this for a very limited number of drugs. To a large extent, they must rely upon the information provided to them by the industries funding the research. When this information is unreliable, then the doctor’s information is unreliable, and when the doctor’s information is unreliable, the doctor will likely prescribe drugs that are either unhelpful or harmful. Thus, the privatization of the biomedical sciences is creating an institutional climate in which individual doctors, through little or no fault of their own, are increasingly prescribing drugs that are harmful to their patients. This is clearly a violation of the principle of nonmaleficence, but it results not – at least in many cases – from ethical lapses on the part of individual doctors, but rather from an ethically problematic institutional environment.
Ingrid Birker, Tania Aldred
Redpath Museum/ McGill University
Dawson Teaching Sheets—19th century natural science on cotton
Sir John William Dawson, Principal of McGill University and founder of the Redpath Museum, created and used a variety of intriguing teaching aids to enhance his lectures in “Natural Philosophy” during the last half of the 19th century. This oral presentation examines one suite of specific teaching aids known as the “Dawson Teaching Sheets”—a collection of 26 hand-painted cotton bed-sheets illustrating fossil plants, animals, tracks, geological strata and archeological tools. These sheets were used to complement a variety of courses taught in Geology and Zoology, as well as the first university courses designed by Dawson specifically for the post-secondary education of women. We will discuss how the aids were designed, how they correlate to the collections of fossils held at the Museum, and their current state of preservation.
Created as early as 1880, the “Dawson Teaching Sheets” were unique and valuable science teaching props that integrate visual stimulation with scientific accuracy. This presentation marks the first time they have been examined, catalogued and described. One teaching sheet depicts Dawson’s reconstruction of the world’s earliest tetrapods (four-legged reptiles). Dawson found these fossils in the coal fields of Joggins, Nova Scotia, and described them in various scientific publications. The frontispiece in the 1863 article Air-breathers of the Coal Period matches the painting on this bed-sheet and served to highlight lectures on the topic.
Session: VI.8 Room: Peter Chalk 2.6 Time: Friday 14:30-16:00
Session: VIII.8 Room: Peter Chalk 2.6 Time: Saturday 9:00-10:30
Helen Blackman
Cardiff University, Cardiff, United Kingdom
The Cambridge School of Animal Morphology 1882-1910
The Cambridge School of Animal Morphology was founded in the mid-1870s by the young graduate Francis Maitland Balfour. The school quickly rose to prominence and Balfour acquired an international reputation for his work in evolutionary biology. However, it seemed the school might fail after his early death in a climbing accident in 1882.
After Balfour’s death his pupil and friend Adam Sedgwick was appointed as lecturer and given charge of the teaching in animal morphology. Since he was not deemed experienced enough, he did not take up Balfour’s chair in animal morphology. Together with Balfour’s former pupil Walter Heape, Sedgwick set about building on Balfour’s work. He published on Peripatus and encouraged the work of other students, continuing to publish Studies from the Morphological Laboratory. Heape helped as demonstrator before deciding that he was not suited to teaching and leaving to help set up the Marine Biological Laboratory at Plymouth. Their subject initially seemed to succeed, but by the 1890s it was on the wane, and in 1907 Sedgwick was appointed Professor of Zoology, part of a series of appointments that saw a move away from morphology.
Animal morphology was both helped and hindered by its connections with the developing medical school in Cambridge. In 1884, the physiologist Michael Foster’s campaigning resulted in the course in Elementary Biology becoming compulsory for medical students. Foster had taught this course since its inception in the early 1870s, but he handed the work over to Sedgwick and one of the botanists. This proved decisive for the school, for Sedgwick’s time came to be occupied with teaching hundreds of medical students. Animal morphology seemed to wane as a subject for advanced research, and became a component of this compulsory course for medical undergraduates.
Patrick Blandin
Muséum National d’Histoire Naturelle, Paris, France
Eco-Anthropology: a fertile hybrid? An epistemological approach to an evolving transdisciplinary field.
Ethnozoology and ethnobotany are ancient disciplines, which represent well-characterized hybrids focussing on the traditional knowledge and practices of local societies related to their fauna and flora. More recently, increasing attention has been given to the relationships between societies and the more or less transformed ecosystems on which they depend. Actually, this new hybrid between ecology and the human sciences is not clearly defined: terms such as “Ethnobiology”, “Ethnoecology” or “Eco-Anthropology” reveal the variability of viewpoints within the scientific communities concerned. This hybrid results from the interdisciplinary efforts of ecologists and anthropologists to analyse the dynamics of local systems which include humans interacting with their bio-physical environments. These new objects are called “ecocomplexes”, “anthropo-ecosystems” or “anthroposystems”. The paper discusses, from an epistemological point of view, the construction of these concepts and their methodological consequences.
Marion Blute
Session: IX.9 Room: Peter Chalk 2.5 Time: Saturday 11:00-12:30
University of Toronto at Mississauga, Mississauga, ON, Canada
What Are The Prospects For A Biological Theory of Everything?
Theoretical physicists, only half-jokingly, speak of seeking a “theory of everything”, by which they mean a unified theory of the four fundamental kinds of interactions or forces in nature - gravity, the strong and weak nuclear forces, and electromagnetism. Traditionally, the genetical theory of evolution (population genetics) has been presented as having already achieved such a unification in biology - most obviously a unification of evolution and genetics. But in addition, physiology and development are incorporated because genes are viewed as programming them, and ecology is incorporated because what many view as the most important cause of evolutionary change, natural selection, is viewed as a product of ecology. However, in recent years there has been much criticism of this synthesis. Some developmentalists have argued that development is as much responsible for what evolves as evolution is for what develops; proponents of niche construction have argued that organisms construct their ecological environment as much as the ecological environment structures organisms; and some students of inheritance have argued that inheritance is as much cytoplasmic and epigenetic as it is genetic. All this raises the question of the prospects for a synthesis - for a unified eco-evo-devo-geno theory of the four fundamental biological processes: the ecological, evolutionary, physiological-developmental, and genetical. In a paper forthcoming in Biological Theory I have attempted a simple statement of such a synthesis by building on Van Valen’s famous aphorism that “evolution is the control of development by ecology”. Evolutionary change can be initiated in either of two ways. First, because of extensive phenotypic plasticity, an ecological change against an existing hereditary background can induce an organism to develop differently. Secondly, a hereditary change against an existing ecological background can lead an organism to construct its niche differently. Hence (excluding non-selective causes of evolutionary change such as mutation and drift), the suggested extension of Van Valen’s statement is that “microevolution by natural selection is change in the inductive control of development by ecology and/or in the construction of ecology by development resulting in change in relative success at repetition (whether individual, demographic or replicative, i.e. both), and ultimately, therefore, in change in the relative frequencies of hereditary elements in a population.” I expand on this amended definition in this paper, but the latter is mainly intended to stimulate a conversation about getting beyond defense and criticism of the old, and moving towards a new, new synthesis.
Bettina Bock v. Wuelfingen
Humboldt-Universität zu Berlin, Berlin, Germany
Founding The New Discipline Reproductive Genetics: The Role Of Model, Theory And Language
Reproductive genetics has emerged as a new discipline in very recent years. It combines disparate fields such as molecular biology, developmental biology, cell biology and gynaecology. Thus, theories and models of fundamental reproductive processes used in the founding disciplines that merge in reproductive genetics are not easy to integrate. Already the balance between model and theory, and the concepts of theory itself, diverge. What, then, are the theories that demarcate this new discipline? Are there common theories, or a common way to model, within the new discipline of reproductive genetics? This paper maps this still new field and its progenitors. In doing so it shows that it is not so much theory and model that serve as the common basis, but rather the orientation towards one common research question of high applicability. This offers the necessary reduction to the genetic language and to primarily genetic methods. However, complexity and diverging languages emerge as soon as the encoded entities and the genetic effects in the animal or human body are modelled. Thus, applying and comparing recent discussions within the philosophy of science, the paper discusses the role of theory and model in this case with regard to the finding that, at this early stage of discipline founding, a common laboratory practice apparently means more to ‘disciplining’ than theory and model do.
Session: XI.7 Room: Peter Chalk 1.6 Time: Saturday 16:00-17:30
Jean-Sébastien Bolduc1,2
1) Université de Montréal, Montréal, Québec, Canada, 2) Université de Bourgogne, Dijon, France
Session: X.3 Room: Newman D Time: Saturday 14:00-15:30
Evolution In Light Of Leibniz’s Principle Of The Identity Of Indiscernibles
In this paper I will attempt to demonstrate that Leibniz’s Principle of the identity of indiscernibles can significantly contribute to enriching the way we conceive the motors of biological evolution. The thesis unfolds in three steps.
First, I postulate that Leibniz’s Principle is valid at the atomic level and hold that atomic-level considerations are relevant for evolutionary thinking. The Principle at this level implies that, independently of space and time localisation, no two atoms are exactly identical. Second, being different, two atoms in close proximity necessarily interact with each other, be it directly or indirectly. Although there are many considerations that might be deduced from that argument, the one on which we focus is the most important one in regard to the biological world: physico-chemical relations. Since any atom can be singled out from the set of atoms with which it interacts, we may thus distinguish an object from its environment. Any such object will take part in a limited set of chemical relations with its environment at any given time. Third, those physico-chemical relations are related to the biological world according to two of their fundamental properties: stability and reproducibility.
I will then argue that those two properties are still relevant for all levels stretching from the atom up to the organism. For the sake of this discussion, I will not consider levels higher than the organismic. From there, I will argue that Leibniz’s Principle is both relevant and enlightening for considerations about the living world and its evolution. It is of deep heuristic value because: 1) every single structure, whatever its degree of complexity, is unique; 2) these structures do not relate to each other in a random fashion, even if within a complex biological structure lower-level structures are exchanged without much effect on stability; 3) however, a large number of secondary chemical structures are necessary for the stability of structures at higher levels of organisation such as the cell or the whole organism; 4) the stability of complex structures is required in order for replication processes to take place.
As will be shown, this approach underlines the importance of epigenetic mechanisms in both development and evolution, and it requires that these processes be considered from a constructivist viewpoint, similar to the one endorsed by Developmental Systems theorists. Finally, through the close examination of some organismal forms enjoying a paradigmatic status within evolutionary discussions, we will try to show how the Principle, embodied in the notions of stability and reproducibility, brings to the fore what could be viewed as evolutionary drift.
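For reference, the principle named in the title is not spelled out in the abstract itself; its standard second-order formulation (a gloss added here, not the author's own notation) is:

\[
\forall x\,\forall y\,\bigl[\,\forall F\,(Fx \leftrightarrow Fy) \rightarrow x = y\,\bigr]
\]

that is, if objects x and y share all their properties F, then x and y are identical.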
Session: VI.3 Room: Newman D Time: Friday 14:30-16:00
Jessica Bolker
Session: XI.6 Room: Peter Chalk 1.5 Time: Saturday 16:00-17:30
University of New Hampshire, Durham, NH, United States
Models-of and models-for: two modes of representation in biological research
Biologists use models in two distinct ways, which have not been clearly articulated. A model may be used either as a representative of a larger group (models-of), or as a proxy for a specific target (models-for). Zebrafish serve as a model of vertebrates in developmental biology; rodents are models for humans in research on the potential of stem cells to repair the brain. The distinction between models-of and models-for is important because the criteria for and implications of model choice diverge in interesting and important ways, depending on which role the model is to serve. So, too, do the kinds of conclusions we can legitimately draw from model-based research. The divergence derives in part from the use of the two sorts of models to answer different kinds of questions. Models-of serve as exemplars of a larger group, often a higher taxon, and most often appear in the context of basic research seeking to answer questions about fundamental or widespread biological phenomena or mechanisms. In contrast, we use models-for as proxies when the “target” species we ultimately want to learn about – particularly humans – is inaccessible or difficult to study, whether for ethical or practical reasons. Thus, models-for are commonest in biomedical research.
With both uses of models, the model must match its representandum (what it is supposed to represent) with respect to the particular criteria that are relevant to the question at hand. Identifying those criteria is important both for choosing a model in the first place, and for assessing the strength of inferences from that model (either to the general case, or to the specific target). It also, usefully, forces us to define the question clearly up front, and to make explicit hypotheses about what aspects of biology are likely to be involved: some research demands physiological similarity between model and representandum, while other questions may require homologous genes or a similarly-sized cardiovascular system.
Differences between models-of and models-for arise in many areas, ranging from the kinds of assumptions on which each is based, to the practical questions of rendering a model tractable for experimental work and convincing others of its scientific utility. Issues such as the significance of phylogenetic context matter for both types of models, but for different reasons; and shared epistemological questions about exactly what the model (noun) is supposed to model (verb), and how to assess its ability to do so, can have different answers.
Michael Bölker, Tareq Syed
Morphisto GmbH, Frankfurt, Germany
Genes and Information
Since Schrödinger’s introduction of the codescript metaphor into biology, molecular genetic processes have usually been modeled using terms and concepts borrowed from information theory and information science. We address the use of these terms critically by considering their specific role in the description of gene expression as a process of transcription and translation. In particular, we pay attention to the transition from DNA to RNA and from RNA to the protein world. Following a description of both shifts within the context of the “information-channel” metaphor, the homonymy of the term “information” is explicated, considering the genetic and epigenetic conditions of the realization of genetic information. Finally, we characterize the use of information-science-based metaphors as models which organize and regulate the structure of biological, biochemical and molecular-genetic experimental systems. We discuss forms of metaphorical speech that appear to be necessary to guide and instruct genetic research.
Session: XII.9 Room: Peter Chalk 2.5 Time: Sunday 9:00-10:30
Giovanni Boniolo
Firc Institute of Molecular Oncology, Milano, Italy
Mathematical Models and Biology: A Philosophical Analysis
By considering some typical examples, the epistemological status of the mathematical models used in biology will be analysed and clarified. In particular, it will be shown how all the mathematical models proposed in the biological sciences should be considered as phenomenological models. This class of models will also be discussed by comparing it with other classes.
The epistemological analysis will be grounded on the notions of ‘as-if world’ and ‘abstraction’, and it will be argued that any mathematical model can be seen as an ‘as-if’ representation, obtained by a process of abstraction (in the sense of abstrahere ab aliquo), of a part of the biological world.
Session: XI.10 Room: Peter Chalk 2.3 Time: Saturday 16:00-17:30
Christophe Bonneuil1, Jean-Paul Gaudilliere2
1) Centre Koyre, Paris, France, 2) CERMES, Paris, France
Navigating the Post-Fordist DNA: Network, Regulations and Variability in Genomics and Society
Twenty years after the beginning of the Human Genome Project, biology seems to have experienced a radical shift in its metaphors and paradigms. The deciphering of genomes and sequences was initially inscribed in a perspective inspired by the “central dogma”, with its vision of the gene as a discrete string of DNA coding for the synthesis of homogeneous, standard proteins along a factory-like production line, itself placed under the control of commanding centers located within the DNA itself. This highly structural and deterministic approach was then deemed the key to biological specificity as well as the main target of biomedical and biotechnological innovation. Meanwhile a new frontier has emerged, made of multiple regulatory influences, expression patterns, and varying assemblies of genes. Complex networks with no commanding molecule and no particular center are now viewed as the major source of biological specificity, change and causality.
Based on the analysis of a few examples of the post-genomics discourse, this paper will address two issues: a) the assumed discontinuity with molecular biology and its discourse of cellular regulation; b) the strong homologies between the network discourse in biology and the rise of a “connexionist world of worth” as described in L. Boltanski and E. Chiapello’s book The New Spirit of Capitalism.
The future of this network-oriented biology of systems is open. Less uncertain is that it represents a new discourse of life with deep resonances in the contemporary vision of corporate management, innovation, and social regulation.
Frédéric Bouchard
Université de Montreal, Montreal, Canada
Session: V.1 Room: Newman B Time: Friday 11:30-13:00
What is a symbiotic superorganism and how do you measure its fitness?
The difficulty in individuating colonial organisms forces us to re-examine key concepts in evolutionary theory, notably the concept of fitness. Since the fates of the participating individuals are intertwined, the question of how to individuate these biological systems and how to measure their reproductive success naturally arises. One finds similar difficulties in models of group selection, where the debate is not whether group selection could occur, but at what level one should expect the differential reproductive success to be measurable (i.e. should we expect groups to produce offspring-groups, or should we merely expect the numbers of individuals in those groups to increase). This difficulty in individuating the biological entities involved is compounded when the emergent individual (or ‘superorganism’) is not constituted of organisms from a single species but from two or more, as is the case in symbiotic communities. As many have argued (e.g. Sapp, 1994, 2003), symbiosis, given its prevalence in nature, has not been given its rightful place in the development of evolutionary theory. Here I will argue that a careful examination of the evolutionary strategies found in many obligate symbiotic associations validates a view of fitness that is not to be understood in reproductive terms but in ecological terms. Complex biological systems such as symbioses can have an emergent fitness value, but it will be described in terms of the differential persistence of the community, not the reproductive success of its members.
Session: V.9 Room: Peter Chalk 2.5 Time: Friday 11:30-13:00
Tomislav Bracanovic
University of Zagreb, Centre for Croatian Studies, Zagreb, Croatia
Session: II.8 Room: Peter Chalk 2.6 Time: Thursday 14:30-16:00
Altruism And Morality: Is Disentangling Really Necessary?
In his article ‘Evolutionary Altruism, Psychological Egoism, and Morality: Disentangling the Phenotypes’ (1993), Elliott Sober has argued that sociobiological explanations of morality usually commit some sort of pars pro toto fallacy, by assuming that the premises ‘Morality involves altruism’ and ‘Evolutionary theory explains the evolution of altruism’ allow for the conclusion that ‘Evolutionary theory explains morality’. As Sober points out, although morality frequently requires altruism, morality and altruism are separate ‘phenotypes’ and the former should be disentangled from, and not explained along the same lines as, the latter. He offers two arguments for this kind of reasoning. (1) Rules of morality do not require limitless altruism but may also enjoin certain acts of selfishness. (2) Even when they do prescribe altruistic action, rules of morality are characterized by impersonality and generality and are, unlike altruism, not directed at specific individuals. In the paper I will present two arguments why this ‘disentangling strategy’ is not sufficient to show that morality and altruism deserve separate evolutionary explanations. (1) Although it is true that morality sometimes restricts altruism, it is also true that standard sociobiological models themselves already recognize the fact that there is no such thing as ‘limitless altruism’. For example, kin-directed altruism and reciprocal altruism are constrained by a specific ‘coefficient of relatedness’ and by a particular ‘cost to donor / benefit to receiver’ ratio, respectively. Therefore, since neither our ‘morality’ nor our ‘biology’ supports indiscriminate altruism, it seems that the ‘no limitless altruism’ proviso by itself will not help to disentangle the two phenotypes in question. (2) Rejecting the possibility of an evolutionary explanation of morality by representing impersonality and generality as the crucial features of morality is a theory-laden maneuver that without adequate justification relies on just one (mainly Kantian) account of morality. Although morality described in Kantian terms relatively effectively evades its reduction to some sort of evolutionarily explained altruism, the question remains open why one should take the Kantian account as descriptively more accurate than some of its theoretically influential alternatives (Aristotelian, Hobbesian, Humean or relativistic accounts). In the final considerations it will be argued that, depending on the particular descriptive conceptualization of morality one chooses, sociobiological explanations of morality can be interpreted as highly plausible, and that a complete ‘disentangling’ of altruism and morality is perhaps not the best strategy for gaining deeper insight into the origin and nature of both ‘phenotypes’.
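For reference, the constraint on kin-directed altruism alluded to above corresponds to the standard formulation of Hamilton's rule, which the abstract does not spell out; in the usual notation:

\[
rB > C
\]

where r is the coefficient of relatedness between donor and recipient, B is the fitness benefit to the recipient, and C is the fitness cost to the donor; kin-directed altruism is expected to be favoured by selection only when this inequality holds.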
Michael Bradie
Bowling Green State University, Bowling Green, OH, United States
Popper’s Dance With Darwin
Karl Popper had a long and ambivalent relationship with evolutionary theory and Darwinism. On the one hand, he told us that he had been an admirer of Darwin from his childhood and that he took Darwinism to be the best available explanation of the development of life on earth. On the other hand, he argued that Darwinism was not a testable theory but was, at best, a valuable metaphysical research programme. He later came to retract that judgment. Nonetheless, his stance toward Darwinism as a scientific theory remained cautious and circumspect. However, he saw a close parallel between his own methodological analysis of the growth of scientific knowledge and Darwinism, and argued that the former threw light on and supported the latter.
I begin with a brief sketch of Popper’s theory of the evolution of the tripartite [World 1, World 2 and World 3] universe. I shall then try to make sense of Popper’s changing views on the status of evolutionary theory. Section 2.1 addresses the question of whether there is a law of evolution. Section 2.2 is a critical assessment of Popper’s one-time view that Darwinism and evolutionary theory are tautological. Section 2.3 reviews the various versions of neo-Darwinism that Popper defended. Section 2.4 is a discussion of Popper’s views on the limitations of Darwinism. Section 2.5 looks at Popper’s contention that Darwinism is a ‘metaphysical research programme.’ Finally, in section 3, I draw some conclusions about the permanent significance of Popper’s evolutionary philosophy.
Session: VIII.2 Room: Newman C Time: Saturday 9:00-10:30
Pierre-Alain Braillard
IHPST-Paris1, Paris, France
Systems Biology And The Mechanistic Framework
The last decade has seen the rise of a mechanistic perspective in the philosophy of biology. At the same time, many biologists have argued for a conceptual change towards more systemic approaches. Whereas analysis in terms of mechanisms seems to capture well the features of the biological sciences, it is questionable whether this framework is going to fit the emerging field of systems biology.
For some authors, systems biology is fully compatible with the mechanistic framework. However, some approaches do not really fit most definitions of mechanisms given by philosophers. Two kinds of limits can be seen. First, whereas a mechanism is usually something local, some approaches try to explain phenomena at the level of whole systems. Here the word system refers to very large and complex networks of interacting components (e.g. gene regulatory networks) and is used to contrast with classical mechanisms that involve few entities. Second, mechanisms are usually defined as temporal and causal processes. This leaves aside a whole class of approaches that try to understand structural constraints at the level of biological networks.
The first limit might be overcome by the use of computer models that allow studying complex mechanisms with non-linear behavior. It is however not clear if these models are explanatory in the same sense as the classical ones. The second limit seems more serious and if one wants to stick to traditional definitions of mechanism, many explanations in systems biology are clearly not mechanistic. For example, an explanation of a system’s robustness in terms of architectural and topological properties is very different from explaining how interactions between different components can produce a phenomenon.
Definitions can of course always be changed to fit new phenomena, but too broad a definition of mechanism could be vacuous. Moreover, there is certainly not a strict demarcation between mechanistic and non-mechanistic explanation in biology, but a continuum between very classical mechanisms, with few entities interacting and producing a phenomenon, and systemic explanations in which entities and their properties play a minor role compared to architectural and topological properties.
My aim is thus to explore how far the mechanistic framework can account for new approaches in systems biology, and how serious its limits are for the philosophy of biology.
Session: XI.1 Room: Newman B Time: Saturday 16:00-17:30
Robert Brandon
Department of Philosophy, Duke University, Durham, United States
Developmental Constraints Reconsidered in the light of the ZFEL
Dan McShea and I are currently synthesizing independent work (his on complexity, mine on variation) that suggests that the standing default condition of all evolutionary systems—the Zero Force Evolutionary Law—is an increase in both complexity and variation. Indeed, complexity and variation are one and the same thing, just viewed at different hierarchical levels. The idea of developmental constraints is reconsidered in the light of this new perspective.
Session: IX.4 Room: Peter Chalk 1.1 Time: Saturday 11:00-12:30
Christina Brandt
Max Planck Institute for the History of Science, Berlin, Germany
Victor Jollos’ research on Paramecium: Pure lines and the concept of “Dauermodification”
This paper analyzes the research of the German zoologist Victor Jollos, who worked at the Kaiser Wilhelm Institute for Biology (Berlin) on unicellular organisms up to the early 1920s. This research led to the introduction of the concept of “Dauermodification”, which was seen as an alternative to the concept of mutation. Inspired by the work of Herbert Spencer Jennings, Victor Jollos started to apply the method of pure-line breeding to research on unicellular organisms in the early 1910s. It was in this early genetics work on unicellular microorganisms - and its requirements for uniformity of research objects - that clones (or “pure lines”) of Paramecium began to be used as standardized research tools. In order to explore whether environmental action could permanently modify genotypes, Jollos treated Paramecium clones with different agents, for example with arsenic acid, in a series of experiments that continued for a couple of years. Motivated by the hope of solving questions about the inheritance of acquired characteristics, this research led to an important differentiation that reopened the discussion of cytoplasmic inheritance among German geneticists in the 1920s: in 1921, as a result of his Paramecium work, Jollos differentiated two kinds of transmission phenomena and variations, which he related to two different cellular substances: 1) the transmission of genes and their variation, which was related to the structure of the cell nucleus; 2) the transmission of changes that were based on modifications of structures in the cytoplasm. Variations belonging to the first group were regarded by Jollos as genotypic variations, or mutations. In contrast, Jollos regarded all variations of the second group as modifications or “Dauermodifications”. In the first part of the paper, I will explore the development of Jollos’ research in detail and in comparison to other research approaches on Paramecium at that time (that is, mainly the work of Herbert S. Jennings). The second part will analyze the influence that Jollos’ concept of “Dauermodification” had on the knowledge of heredity in the 1920s. In particular, debates on the relationship between cell nucleus and cytoplasm and on cytoplasmic inheritance will be discussed.
Session: IV.9 Room: Peter Chalk 2.5 Time: Friday 9:30-11:00
Angela Breitenbach
University of Cambridge, Cambridge, United Kingdom
Connections Between Purpose And Value In Nature
This paper is concerned with the relationship between the conception of purposiveness in nature and ethical claims about the value of nature. As concern for our natural environment is growing, purely mechanistic conceptions of nature have been criticised for attempting to explain away the seeming purposiveness of nature and for reducing the value of nature to the way in which humans use it for their ends. This reductionist and scientifically oriented view of nature, so the criticism runs, has made possible an exploitative attitude towards it. Some environmental philosophers, among them Hans Jonas, have argued for an alternative conception which takes the purposive character of nature seriously and uses it as a basis for the claim that nature has intrinsic, and not merely instrumental, value. Duties to nature are binding, Jonas argues for example, because we experience nature as unmistakably purposive and because such purposiveness is a sign of value in nature.
I claim that the Jonasian environmental philosopher has a valid ambition, yet his argument does not go through. Jonas is right to argue for the intuitive validity of the claim that nature has a value over and above its usefulness to human beings, and that this value has some connection with the seemingly purposive character of nature. Yet Jonas’s inference from the experience of purposiveness to the existence of intrinsic values commits a naturalistic fallacy. It slides from a descriptive notion of purpose experienced in nature to a prescriptive notion of purpose considered to impose an obligation on human agents. I propose that an alternative argument, inspired by a Kantian conception of teleology, may retain claims for the experience of purposiveness in, and the existence of duties to, nature while avoiding the fallacy. I argue, first, for an analogical conception of purposes in nature. Thus, organisms and larger eco-systems are judged teleologically in analogy to the purposive activity of our own reason, which strives for the ends it sets itself. In contrast with Jonas, no claim to absolute knowledge of natural purposes is justified. The capacity to reason and to set oneself free ends, however, is on the Kantian account the only candidate for unconditional value. In a second step I argue that if we have to view nature in analogy with our own reason, we thereby consider it by analogy to what has unconditional value. By pressing this analogy between reason and nature, I thus explore how far a weaker teleological account of nature based on analogy may give stronger grounds for viewing nature, again in analogy with our own reason, as intrinsically valuable.
Session: VI.5 Room: Peter Chalk 1.3 Time: Friday 14:30-16:00
Ingo Brigandt
University of Alberta, Edmonton, Canada
Typology Now: Homology and Developmental Constraints Explain Evolvability
By tying homology and morphological organization to evolvability, the paper presents a notion of homology that attempts to 1) bridge the gap between developmental and phylogenetic approaches to homology and to 2) show a sense in which developmental constraints and natural selection are compatible and in fact complementary.
Motivated by Wagner’s work on the character concept, I conceive of a homologue as a unit of morphological evolvability, i.e., as a part of an organism that can generate heritable phenotypic variation independently of the variation generated by the organism’s other homologues. An account of homology therefore consists in explaining how an organism’s developmental constitution results in different homologues/characters as different dimensions of phenotypic evolvability. The explanans of an account of homology is developmental, yet the very explanandum is an evolutionary phenomenon: evolvability on a character-by-character basis, which manifests itself in phylogenetic patterns as recognized by phylogenetic approaches to homology.
While developmental constraints and natural selection have often been viewed as antagonistic, I argue that both are complementary as they concern different stages in the evolutionary process. Developmental constraints, conceived of as the presence of the same set of homologues across phenotypic change, pertain to how heritable variation can be generated in the first place (evolvability), while natural selection operates subsequently on the produced variation. Finally, I use the fact that homologues exist on different levels of organization to point to open empirical challenges for evo-devo, and to emphasize that an account of evolvability must address entities on several levels of organization and their interaction.
Session: XIII.4 Room: Peter Chalk 1.1 Time: Sunday 11:00-12:30
Mathias Brochhausen1,2, Ulf Schwarz1,3
1) Institute of Formal Ontology and Medical Information Science, Saarbrücken, Saarland, Germany, 2) European Center for Ontological Research, Saarbrücken, Saarland, Germany, 3) Philosophy Department, Saarland University, Saarbrücken, Saarland, Germany
Species Essentialism Without Attributes – Processes, Patterns and Biological Ontologies
We propose a species essentialism which is based on ontological considerations regarding biological processes. Species are bearers in biological processes. One of these processes is such that a species which undergoes this process ceases to exist and two new species occur; this process is called speciation. In our view process universals, e.g. speciation, are patterns in aggregations of processes. Thus, what a species is can only be explicated through a prior definition of speciation, and not vice versa, i.e. the existence of a species depends on the existence of speciation. Since this view on the taxonomy and underlying ontology of species takes the process of speciation as central, we have to provide an ontological analysis of (biological) processes in general, and speciation in particular. We argue that a process-type is a specific pattern of underlying, more basic processes (sub-processes). Atomic processes constitute the end of this reduction. These are processes on the sub-molecular level, usually investigated by physics or chemistry. We clearly acknowledge that reducing processes to the very end might not be conceivable at present for all complex, high-level processes, but this is only due to the current state of the art in the natural sciences.
When we take a process of raising an arm as an example, we show that we cannot capture the sub-processes of the process by merely dividing the whole motion inch by inch (as a camera would do). Rather, when we examine two individual cases of raising an arm, a common pattern between these two individual processes can be found, even if both individual processes differ in important attributes, such as duration, direction in space and velocity. Raising an arm slowly or quickly is still raising an arm. What all individual processes of raising an arm have in common is a pattern in the aggregation of their individual sub-processes. One and the same pattern can be exemplified by two different individual processes even if they always have different individual sub-processes. Speciation is a process just like raising an arm, and a species is the result of speciation just as an elevated arm is the result of raising an arm. We do not require a prior conception of species to account for speciation. For, there was no species before speciation. What defines a species is its specific place in the succession of speciations, its place in the phylogenetic tree. Thus, taxonomy is no longer an order of perdurant universals, but represents an order of process universals, which are patterns in processes and feature as essential elements in taxonomy.
Session: XII.8 Room: Peter Chalk 2.6 Time: Sunday 9:00-10:30
Rachel Bryant
University of Toronto, Toronto, Ontario, Canada
“Invasive” Species and the Diversity-Stability Hypothesis
The diversity-stability hypothesis has been a keystone of community ecology since the young discipline began to thrive in the mid-twentieth century. Roughly, the hypothesis states that the stability of ecological communities correlates positively with their diversity. Current debates about the hypothesis focus on the definitions of diversity and stability. Yet all sides have this in common: an assumption that the putative correlation is driven by diversity.
Contemporary incarnations of the diversity-stability hypothesis attend solely to the effects of diversity on stability, neglecting the effects of stability on diversity. However, the earliest versions of the hypothesis took them seriously. For instance, in his watershed The Ecology of Invasions by Animals and Plants (1958), Charles Elton relied on both the effects of diversity on stability and the effects of stability on diversity to explain ecological communities’ differential susceptibility to so-called invasive species. In this paper, I revisit Elton’s view and suggest a novel way in which invasions demonstrate the bi-directionality of diversity-stability relationships.
No matter which of the many definitions of diversity we choose, the invasion of an ecological community by a new population constitutes an increase in that community’s diversity. This is puzzling if we assume that diversity-stability relationships are positive and unidirectional. Such an assumption would lead us to conclude that invasions heighten communities’ stability. In fact, sometimes invasions have little effect, and sometimes they precipitate future reductions in both stability and diversity. I suggest that whether or not an invasion’s initial boost to diversity affects a community’s future diversity depends in part on the stability of the community at the time of invasion. Thus, I suggest that a community’s stability influences its diversity.
Session: VII.8 Room: Peter Chalk 2.6 Time: Friday 16:30-18:00
Keynyn Brysse
Institute for History and Philosophy of Science and Technology, Toronto, Ontario, Canada
From Weird Wonders to Stem Lineages: The Second Reclassification of the Burgess Shale Fauna
The Burgess Shale is a fossil bed containing the exquisitely preserved remains of marine invertebrate organisms from shortly after the Cambrian explosion, a time when most or all of the known major animal groups (phyla) rapidly evolved. The Burgess Shale was discovered in 1909, but it was not brought to widespread scientific and public attention until 1989, with the publication of Stephen Jay Gould’s bestseller Wonderful Life: The Burgess Shale and the Nature of History. Gould dubbed the bizarre Burgess forms ‘weird wonders,’ and proposed twenty new classes of arthropods and fifteen to twenty new phyla of animals to contain them. He interpreted the Burgess creatures as unique, unrepeatable experiments in the early history of evolution.
On this interpretation, the Burgess Shale organisms display more than twice the taxonomic disparity seen in modern animals. Gould used this interpretation to argue that the diversity of life has decreased over time, not increased as is commonly believed, and that our modern world is populated not by the winners of a Darwinian struggle for existence, but only by the descendants of a random handful of survivors of a sudden (and mysterious) Cambrian decimation.
It is this story, Gould’s story, that people have heard, if they have heard of the Burgess Shale at all. In the twenty years since Gould wrote this book, paleontologists and systematists have given a very different interpretation to the diversity of Cambrian form, its taxonomic significance, and its significance for our understanding of the nature of diversity and the operation of evolution. Burgess Shale studies have entered a new phase.
There are, then, two historical shifts in Burgess Shale research, one marked by Gould’s ‘weird wonders’ and the other marked by the contemporary interpretation. Both lend themselves well to a Kuhnian analysis. The definitions of key terminology are incommensurate between these two periods, and each period included its own unique set of problem-solving exemplars. Through changing definitions of key concepts like ‘phylum,’ through the introduction of new terms such as ‘stem group,’ and through changes in systematic methodology from evolutionary systematics to cladistics, a completely new understanding of the Burgess Shale creatures, their place in the classification of life relative to modern organisms, and the significance of their existence and ours, has been forged.
Session: XI.3 Room: Newman D Time: Saturday 16:00-17:30
Robert Bud
The Science Museum, London, United Kingdom
A sword from the field of battle: The double helix and the secret of life in 1950s Britain
This paper will argue that the double helix model built by Crick and Watson in 1953 can be treated as a relic of an intense battle over the nature of life. While the debate between vitalists and antivitalists was a century old, it was particularly urgent in England in the early post-WW2 years.
On the one side were materialist reductionists, who took a ‘modernising’ side in this strongly fought cultural debate. The biotheoretical views of British biochemists before World War 2 have been much discussed, as have such leading figures as J.B.S. Haldane and J. D. Bernal. The paper will argue that Francis Crick can be seen as a younger member of the community of reductionist materialists for whom life was matter. Although lacking the Marxist political commitment of the older generation, Crick himself was well-known as an antireligious materialist and was happy to describe Bernal as his ‘scientific grandfather’. If the solution he and Watson found was in DNA and not in protein, that was a development, not a denial, of Bernal’s teaching. In his polemical lectures Of Molecules and Men (1966) he described living beings as machines in very much the same terms as Bernal twenty years earlier and portrayed a conflict between science and mystics. Famously, Crick ran into the Eagle pub in Cambridge with the declaration that he had solved the secret of life. In Watson’s Double Helix this is taken as youthful exuberance. This paper will argue that instead it should be seen as a serious war-cry in the ongoing conflict.
On the other side was a strong anti-reductionist romantic movement represented by the writers C. S. Lewis and J. R. R. Tolkien. They met regularly with friends, as the ‘Inklings’, at an Oxford pub which, ironically, was called the Eagle and Child. Their books written at this time have sold hundreds of millions of copies, and it is said that the Lord of the Rings is the world’s favourite book. There is a rich interpretative literature on the deeply antagonistic attitudes to scientism of this group, and their wish to maintain the mystery of life. Lewis himself had written a cycle of science fantasies, which pitted good against evil men corrupted by excessive zeal for corrupt science. One of Lewis’s central characters, a scientist named Weston, has been clearly identified as based on Haldane and Bernal. Bernal’s The World, the Flesh and the Devil was certainly seen as a key provocation to the writers.
This paper will therefore explore the meaning of the claims made for DNA in 1953 in terms of the conflict between these two opposing, and each hugely influential, communities.
Session: IV.4 Room: Peter Chalk 1.1 Time: Friday 9:30-11:00
Marcello Buiatti
University of Florence, Florence, Tuscany, Italy
The “Benevolent Disorder” and Recognition Processes as Conditions for the Different Adaptation Strategies of Prokaryotes, Eukaryotes and Humans
Charles Darwin, particularly in “The variation of animals and plants under domestication”, thoroughly discussed the origin and nature of variation. According to Darwin, variation is a condition for evolution, may be inherited and directly or indirectly induced by the environment, and follows the law of correlation. In the XXth century, on the contrary, the introduction into Biology of the powerful reductionist method of analysis, and the use and misuse of “Occam’s razor”, led to the prevalence of a deterministic approach within the frame of which variation was considered as completely random noise, with only a small number of useful mutations being maintained through selection. A physicist, Gregor Mendel, was the first to introduce the concepts of the discreteness of genetic elements, their random distribution in subsequent generations, and the complete determination of the phenotype by genes. These concepts seemed to be wholly confirmed by evidence coming from Molecular Biology, leading to the formulation of the “Central Dogma” by Francis Crick and the definition of DNA as the “fundamental invariant” of living systems by Jacques Monod.
In the last 5-10 years, the surprisingly fast improvements in analytical tools for genome structure and function, as well as progress in fields such as developmental genetics, neurosciences, ecosystems and evolutionary dynamics, are completely changing our views of life. In the present talk, experimental evidence supporting some of the new concepts will be discussed, as well as their relevance for the solution of the antinomies between chance and necessity, discreteness and continuity, additive and interactive behaviour, genetic and environmental determination of phenotypes. Data will be presented on the development throughout evolution, in prokaryotes, eukaryotes and humans, of different genetic and phenotypic “variability generators” and of the recognition processes needed to “choose”, in different spatial and temporal contexts, the portion of “benevolent disorder” to be used for adaptation. The different adaptation strategies will then be discussed on the basis of existing evidence showing a prevalence of genetically based mechanisms in bacteria, phenotypic plasticity tools in plants and animals, and neural plasticity in humans. Some tentative conclusions will finally be drawn on the implications of the new synthetic concept of “multiversity” of living systems for the dynamics of the interaction between the “Zeitgeist” (spirit of the times) and scientific communities, relevant for the choices of the “observation point” from which, during the last two centuries, research has been looking at biological systems.
Session: III.4 Room: Peter Chalk 1.1 Time: Thursday 16:30-18:00
Michael Bürgi
Institut d’histoire de la médecine et de la santé, Genève, Switzerland
From Organic Chemistry to Molecular Biology – Practical, Institutional and Strategic Shifts in Drug Development at Hoffmann-La Roche, 1960–1980
By the early 1960s, the Swiss based pharmaceutical company F. Hoffmann-La Roche & Co. (Roche) was one of the world’s largest manufacturers of synthetic vitamins and drugs. The synthetic benzodiazepine tranquilizers Librium and Valium, developed in the Roche research departments and launched by the company in 1960 and 1963, further strengthened its position as a leading chemistry-based company. Forty years later, Roche had become, according to business historian Alfred D. Chandler, a leader in genetic engineering and had played a significant role in building the infrastructure of the emerging biotechnology revolution.
This paper addresses the question of how and when Roche started transforming its long-standing research and manufacturing capacities based on organic chemistry into a significant strength in biotechnology. By focussing on the creation of the Roche Institute of Molecular Biology (RIMB), founded in 1967, I will argue that, since the mid-1960s, profound changes took place
at Roche. These changes became decisive when the company embarked on reinforcing its biological research capacities and establishing new relationships with the academic molecular biology community.
The changes I will outline in my paper took place on several levels of corporate research, and affected practical, institutional and strategic aspects of drug development and drug manufacturing. On the strategic level, increasing dissatisfaction with the traditional strategies of drug development gave rise to discussions within the company’s research management on whether Roche should devote more effort to biological research. Institutionally, these discussions were followed by the creation of the RIMB, a relatively independent research centre. On the practical level, Roche research increasingly focused on molecular targets rather than chemical bullets. Furthermore, the RIMB was granted academic research conditions that the company had not been familiar with before.
In my paper, I will also try to outline some answers to the historically important question of why these changes were beginning to take place in the 1960s, and how they might have contributed, in the long run, to Roche’s present-day position as a leading actor in biotechnology.
Time: Thursday 14:30-16:00
Exploratory Experimentation in Recent Molecular Biology and Genomics
In this paper I argue that we are not far from the beginning, perhaps in the middle, of a phase of exploratory natural history of genetic elements, the pathways by which gene products are produced, and the zoo-full of regulatory elements affecting expression and readout of genes and related processes. I argue that this zoo is not only much larger than the “fundamental particle zoo” that particle physics explored around a half century ago, but also (very likely) of a significantly different character. Unlike the fundamental particles of physics, there are no strong physical, chemical, or biological principles that can organize a general classification of the molecular entities and processes of interest. Put differently, the relevant distinctions among molecular entities and processes are not likely to rest on fundamental principles. I argue that this is likely to be a fundamental fact about the natural history of the relevant entities and processes because they are ‘historical’ entities and processes. Thus, even though they are, at bottom, molecular and chemical processes, the organization of our understanding of these objects of intense study is much more ‘contingent’, much more dependent on their history, much more dependent on the roles they play in various larger systems. This heightens the systematic importance of exploratory experimentation not only at present, but (I suggest) as a major means for the indefinite future in understanding the entities studied in molecular biology, genomics, evolutionary developmental biology (evo-devo) and numerous related fields. I will reinforce these claims by reference to a few key examples, probably including recent work on heat shock protein 90 (hsp90), which also illustrates the striking changes that will be wrought by recent molecular work along the lines studied in all three papers in this session.
Session: VI.8 Room: Peter Chalk 2.6 Time: Friday 14:30-16:00
Session: II.3 Room: Newman D
Extinction in German Natural History, 1790-1830
Rather than add another volume to studies of “evolution” and “biology,” I propose to track the emergence and development of the concept of “extinction.” At the center of my project are a core group of closely interacting figures, composed of two illustrious teachers: Johann Blumenbach (1752-1840) and Abraham Werner (1749-1817), and four of their students: Ernst Schlotheim (1764-1832), Karl von Hoff (1771-1837), Leopold von Buch (1774-1853), and Alexander von Humboldt (1769-1859). My contribution to the field will be to demonstrate how the notion of “extinction” is critical to developing a more robust sense of the earth’s history, one in which plants, animals, humans, and Providence play unpredictable, interconnected roles.
[Note: The above paragraph is an abstract from my dissertation prospectus. What I would like to do in Exeter is to briefly present the major features and arguments of my project in order to elicit feedback from the various philosophers, biologists, and historians in attendance. I want my dissertation to reflect the interests of a broader audience than just the historians in my department at Berkeley.]
Dick Burian
Virginia Tech University, Blacksburg, VA, United States
Thomas Burnett
University of California, Berkeley, Berkeley, CA, United States
Session: VIII.5 Room: Peter Chalk 1.3 Time: Saturday 9:00-10:30
Jason Byron
University of Pittsburgh, Pittsburgh, PA, United States
Holistic Medicine and the Rise of Sexology in the Weimar Republic
Sexology emerged as a field of medicine during the first half of the 20th century. In 1913, three professional societies were formed for the scientific study of sex, two in Berlin and one in London. The following year, the first sexology journal began publication, and in 1919 Magnus Hirschfeld opened the first institute for sex research. Between 1921 and 1932, seven international conferences were convened across Europe. By 1933, little trace of the thriving field remained. Hirschfeld was dead, his associates sent to death camps, and his institute burned. Examining the rise and precipitous decline of German sexology reveals much about medical practice in the Weimar Republic. During this period, experimental physiology and public health began institutionally separating themselves from clinical medicine. While physiology explicitly adopted mechanistic experimental protocols from physics starting in the late nineteenth century (from Hermann Helmholtz, for example), public health allied itself with the Frankfurt Institute for Social Research and its more dialectical approach. Sexology found itself within the interstices of this yawning divide — bridging eugenics, pathology, epidemiology, and social anthropology while at the same time appropriating the moral authority of traditional clinical medicine. In my talk, I argue that it was precisely the ‘holistic gaze’ of German physicians, that is, their non-mechanistic modes of understanding, that gave early sex researchers their epistemic, moral, and political authority and that presented a growing threat to the agendas of National Socialism.
Session: XII.3 Room: Newman D Time: Sunday 9:00-10:30
Joe Cain
University College London, London, United Kingdom
An epistemic community glued together. Evolutionary studies in the 1930s
In tales of founding the evolutionary synthesis, the recovery of natural selection and Darwinism takes centre stage. But this is a myth - a tactically and strategically useful fabrication. This paper examines what’s obscured by this founding myth. Processes of isolation and divergence were major features of many research programmes in botanical, zoological and biological settings during the 1930s. These interests had roots in the work of previous generations. They also were rooted in a modern sense of sound experimental epistemology. The late 1920s and 1930s saw the pace of developments in isolation and divergence studies quicken over an ever-expanding range of specialities. Technical innovations, new conceptualisations, and discovery combined to create a series of excited frenzies. Curiously, these developments are overlooked by founding narratives of revival. But in the 1930s, it is this emphasis on processes causing isolation and divergence that gave researchers an increasing sense of mastery over the origin of species and an increasing sense of confidence that modern research methods brought such questions within their reach. As the 1940s approached, discussion frequently centred on matters of extension, relevance, and overlap. These discussions fuelled calls to increase respect for “speciation studies” and encouraged the expansion of “borderland” or “synthesis” research by the late 1930s.
Werner Callebaut1,2
1) Konrad Lorenz Institute for Evolution and Cognition Research, Altenberg, Austria, 2) Hasselt University, Diepenbeek, Belgium
From Systems Biology to Evo-Devo and Back
Systems biology has been hailed as dealing with “the ultimate many body problem of living matter” (Thorpe and Carlsson 2004) and as putting biology on a solid physical footing—“a consistent framework of knowledge based on fundamental laws of physics,” as Kitano (2001) put it. Yet, the “physicalization” of systems biology threatens the application of a set of concepts related to functionality that seem indispensable for biological explanation (Krohs and Callebaut 2007). The talk will focus on “global,” nonreductive systems biology (Nicholson et al. 2004) to ask “where is the actual biology in ‘systems biology’?” (Huang 2004), and discuss how evo-devo could contribute (although this may seem paradoxical at first sight; Baguna and García-Fernández 2003) to less internalist approaches to systems biology.
Brett Calcott
Australian National University, Canberra, ACT, Australia
Two Ways That Modules Enable Evolvability
Modularity is a term with many, sometimes subtly different meanings. In this paper I examine the connection between modularity and evolvability. I distinguish between modularity as “quasi-independence” (modules are good as they enable independent change) and modularity as “re-usability” (modules are good because useful functionality can be redeployed). To what extent do these two possible advantages of modularity interact? I connect these issues to some other putative properties of evolvability: Kirschner and Gerhart’s notion of “weak-linkage”.
Session: IV.11 Room: Peter Chalk 2.1 Time: Friday 9:30-11:00
Session: V.11 Room: Peter Chalk 2.1 Time: Friday 11:30-13:00
Session: IX.4 Room: Peter Chalk 1.1 Time: Saturday 11:00-12:30
Session: IV.1 Room: Newman B Time: Friday 9:30-11:00
“Complex Recombinations”: Rethinking the Death of de Vries’ Mutation Theory
In this paper I propose to reinterpret the received view of the reception and rejection of de Vries’ epochal “mutation theory,” in favor of a less Whiggish account paying attention to the important role of shifts in the meaning of mutation during this period. The received view of the fall of de Vries’ mutation theory holds that careful cytological work—conducted most notably first by Bradley Davis in 1910-1912, and then “confirmed” and developed from 1923 to 1950 by the painstaking work of Ralph Cleland—disproved de Vries’ theory by demonstrating that the purported “mutations” or new species de Vries thought he had discovered were in fact little more than “complex recombinations” (Garland Allen). Clear cytological evidence alone, according to the received view, when properly interpreted can account for the fall in popularity and ultimate rejection of de Vries’ theory. From being the prime example of a plant in the midst of a mutating period to being recast as an aberration with an exceptional number of karyokinetic idiosyncrasies, Oenothera rapidly came to stand as a shorthand for the dangers of paying too much attention to the wrong kind of organism when developing a general theory. Recent work highlighting important shifts in the meaning of mutation in the first three decades of the twentieth century, however, suggests that a more complex view of the rejection of de Vries’ theory may be at hand. In this paper, I explore this possibility and propose that competing definitions of mutation operative at the time, largely correlating with disciplinary divides between zoologists and botanists, combined in “complex recombinations” with ever- increasing clarifications of distinct levels of variation and the runaway success of the drosophilists to help cement the meaning of “mutation” as essentially genic in nature. (This was an association that was even further strengthened by the unceasing efforts of H. J. Muller.) This excision of the karyotypic from the realm of mutation was therefore not only an important and heretofore relatively unmentioned shift in the meaning of mutation during this period, as I have described elsewhere, but even contributed to the downfall of de Vries’ theory, one of the most important and inspirational theories of evolution of the time. Not even the later discoveries of Albert Blakeslee and others, that such karyokinetic phenomena are in fact among the normal modes of variation and speciation for many types of plants, could save de Vries’ theory from the complex recombinations of such powerful semantic mutations.
Jane Calvert1, Joan Fujimura2
1) University of Exeter, Exeter, United Kingdom, 2) University of Wisconsin, Madison, United States
Systems biology: the revolution after the revolution?
It is helpful to examine the genomics revolution in the context of another alleged revolution: the move to ‘systems biology’, the self-proclaimed successor science to genomics. This paper will examine systems biology in the context of the language of “the genomic revolution” and the “post-genomic” aftermath. Our aim is to point to the continuities and discontinuities in problems, concepts and experimental practices and to examine the aims of productions of ruptures in the history of biology. Systems biologists differ on what is innovative in the content and methods (such as new technologies, computation and conceptual frameworks) of their field. However, they all appear to agree that their field differs from genomics in its emphasis on integration: the integration of different types of data, different levels of analysis, and different disciplinary approaches. The aim of this integration is to build complex models, which requires sophisticated mathematical skills. Systems biology also involves the study of emergent properties, and is sometimes accompanied by grand epistemic ambitions and evangelical zeal.
Some systems biologists (as well as some historians and sociologists of biology) engage in the production of “genomics” as a reductionist science that did not deliver the cures for diseases or provide the depth of understanding that were promised.
Many systems biologists discuss and debate about whether what they do is new or not. We examine these discussions and analyze the production of a postgenomic systems biology as part of the production of a history of biology with an emphasis on why continuity is stressed in some circumstances and discontinuity in others. This paper will address these issues by drawing on fieldwork conducted in systems biology laboratories in the US and the UK.
Luis Campos
Drew University, Madison, NJ, United States
Session: VIII.6 Room: Peter Chalk 1.5 Time: Saturday 9:00-10:30
Session: VI.11 Room: Peter Chalk 2.1 Time: Friday 14:30-16:00
Havi Carel
University of the West of England, Bristol, United Kingdom
Unifying Phenomenological and Biological Descriptions of Disease
Much philosophical work has been dedicated in the past thirty years to exploring the concepts of health and disease. A prominent account of disease is the naturalistic, or objectivist, approach of Christopher Boorse and his supporters, who see disease as a value-free objective concept. On this view, disease is a biological dysfunction of a body part or system. In this paper I argue against a purely biological understanding of disease and suggest augmenting the biological account with a phenomenological perspective.
I argue that disease cannot be captured solely through a biological account. This is because disease (or disability) redefines the relationship of the person to her world, and moreover transforms this world by altering and limiting it. As embodied persons we experience illness primarily as a disruption of lived body rather than as a dysfunction of biological body (Toombs 1988). But biology, and more specifically medicine, have traditionally focused on returning the biological body to normal functioning, and have therefore worked from within a problem-focused, deficit perspective that ignores the lived body. Within this approach, the experience of the ill person is measured in negative objective parameters, i.e. how ill or impaired she is, while the lived experience of illness, which varies tremendously from one person to another, is overlooked (Lindsey 1996).
In this paper I claim that there is a need to unify the biological-naturalistic and phenomenological models of disease. I show the problems in the naturalistic approach and suggest augmenting it with a phenomenological model. A phenomenological approach, I claim, can provide a framework for incorporating the experience of illness into the medical naturalistic account, by providing a rich description of the altered relationship of the ill person to her world. Similarly, qualitative data analysis methods can be applied to naturalise the subjective accounts of ill people. This synthesis demonstrates a more general point, namely, how phenomenology can enter into a fruitful dialogue with biology.
Francisco Carrapiço
University of Lisbon, Secretary of the International Symbiosis Society, Lisbon, Portugal
From Symbiosis to Symbiome: an epistemological approach
Our world is a symbiotic world, and one of the main characteristics of biological systems is to establish associations and connections with other organisms. This manifestation is one of the main characteristics of life and of its diversity. In a way, life has not established itself or developed to exist alone. Since the introduction of the symbiosis concept by Anton de Bary in 1878 as “the living together of unlike named organisms” and its new theoretical formulation – symbiogenesis – by Constantin Merezhkowsky, in 1909, as “the origin of organisms by the combination or by the association of two or several beings which enter into symbiosis”, this field of science has been a place of controversy and discussion. The symbiogenesis concept was a landmark for the development of further studies on biology and evolution, even if it was not well understood at the time it was formulated, nor received proper attention from the scientific community. Throughout the twentieth century, biologists have generally considered symbiosis as a curiosity, a rare exceptional phenomenon and nothing more than a residual aspect of the evolution problem. Its study fell largely outside the conceptual and technical framework of biology, and namely of neo-Darwinism.
Evolution - a complementary process of divergence and integration - is considered the core organizing concept and the structural pillar of modern biology. Symbiosis, the physiological and/or genetic integration of taxa, is recognized today as lying at the basis of macro-evolutionary changes. It is held to have played a central role in the evolution of eukaryotes, the origin of land plants, and a myriad of adaptive evolutionary innovations. It is at the basis of important ecosystems, from deep-sea vents to the rainforests and coral reefs. From a symbiotic perspective, every plant and animal is a superorganism, a symbiome, comprised of chromosomal genes, organellar genes, and, often, of bacterial symbionts as well as viruses. This concept extends the limits of the multicellular organism beyond the activities of its “own” cells. The development of a Symbiogenic Theory of Evolution could contribute towards a new epistemological approach to the symbiotic phenomenon in the evolutionary context. This, in our point of view, could be the beginning of a new paradigm in science that remains almost unexplored.
The recent creation of Symbiomics by Jan Sapp (2003) as a new term to characterize the field of knowledge on symbiotic studies is also an important contribution to unify and consolidate these studies. This input will benefit not only the scientific level, but will also improve the tools available for teaching this science in schools and universities. This contribution could also be seen from a more cooperative-learning perspective, using different live and virtual symbiotic systems to motivate the scientific and educational community for this new paradigm, towards scientific knowledge at the beginning of the XXI century.
Session: VII.1 Room: Newman B Time: Friday 16:30-18:00
Eric Charmetant
Centre Sèvres, Facultés jésuites de Paris, Paris, France
Towards Analogues of Ordinary Morality in Apes
Since its revival in 1975, debates about the contribution of evolutionary ethics have concentrated mostly on realism/antirealism issues and, to a lesser extent, on normative ethics, trapping evolutionary ethics within the frame of philosophical ethics. As the syntactical and semantic dimensions of human language are needed for philosophical ethics, it seemed very unlikely that an extension of morality would be found outside the human species. Attempts at speaking about the moral dimension of apes, made by Edward O. Wilson (1978) or Frans de Waal (1996), were often disqualified as oversimplified or anthropomorphic.
This crude elimination of animal morality comes mainly from an ignorance of the ontogenetic dimension of morality and of the fact that very sophisticated moral theories (such as those developed in contemporary analytical moral philosophy) could underestimate ordinary morality as the core of the moral phenomenon. Contemporary moral psychology, launched by the pioneering work of Jean Piaget and developed in different ways by Lawrence Kohlberg and Martin Hoffman, is a good basis to identify the main characteristics of ordinary morality: centrality of reciprocity, ability to follow shared rules, ability to handle exceptions, and ability to regulate the state of relations in the group.
Through the current attempts in primatology to identify and interpret such phenomena, mainly in great apes, we propose, following the insights of Konrad Lorenz (1956) and Hans Kummer (1980), to explore anew the possibility of analogues of ordinary human morality in other living species. Through the conflicts in interpretation among primatologists and psychologists concerning the areas of consolation, reconciliation, sense of justice, obedience to implicit shared rules, and perception of others’ intentions by apes, we present the outlines of the analogues of ordinary human morality in apes, and their similarities and differences to our own.
Session: V.9 Room: Peter Chalk 2.5 Time: Friday 11:30-13:00
Vasco Castela
University of Manchester, Manchester, United Kingdom
Virtuous Behaviour Need Not Be an Evolutionary Stable Strategy
According to traditional accounts of evolutionary theory, a trait can only be an ESS if it increases the fitness of the individual. As a consequence of this feature of evolution, any virtuous behaviour that involves some form of altruism seems impossible more or less by definition. Common approaches to solving this problem, such as in the work of Axelrod, Trivers, Danielson or Sober & Wilson, have tried to explain the existence of altruism by claiming that behaving altruistically can in fact be an ESS or, as in the approach of Robert Frank, that being an altruist is evolutionarily advantageous, on average. I wish to claim that altruistic behaviour can be seen as a potential capability, rather than a trait. And so, as happens with any capability that does not usually express itself, it may remain hidden from evolutionary pressure. For instance, being capable of learning how to direct a film or design a web site are not things that have been submitted to evolutionary scrutiny. We should not expect them to be evolutionarily advantageous, being aware that evolution has obviously not worked out every possible use, in a modern society, of traits that were designed to be evolutionarily advantageous.
For Aristotle, the acquisition of the virtues required for virtuous (including altruistic) behaviour is only possible given an adequate moral education. Such moral education, however, can only occur in a carefully engineered social environment, with complex customs and laws. If we accept that such environments have not been common in our evolutionary history, all capabilities that may only be acquired within them can be taken not to have been shaped directly by evolutionary pressures. The evolutionary scenario I suggest is one in which evolutionarily advantageous emotions, ordinarily used to help kin and close friends (with the potential beneficial effect of reciprocity), could be trained in very specific artificial (uncommon) environments to extend to strangers, allowing for an extended altruism.
This framework allows us to posit much more powerful forms of genuine altruism, as opposed to the rather weak ones made possible by other accounts that have attempted to directly show how altruistic behaviour is an ESS.
Session: VI.8 Room: Peter Chalk 2.6 Time: Friday 14:30-16:00
Session: III.8 Room: Peter Chalk 2.6 Time: Thursday 16:30-18:00
Céline Cherici
Paris 7-REHSEIS, Paris, France
Félix Vicq d’Azyr’s understanding of human cerebral structures and contribution to the field of brain anatomy in late eighteenth-century France
Vicq d’Azyr was an important French anatomist who had a major influence on comparative anatomy, on the anatomy of the brain and cerebellum, and on medical developments in France. He was professor of anatomy at the veterinary school at Alfort.
In his Traité d’anatomie et de physiologie avec des planches coloriées représentant au naturel les divers organes de l’Homme et des Animaux (1786), and in the article “anatomie pathologique” in the Encyclopédie méthodique, Félix Vicq d’Azyr refers to and synthesizes many contemporary observations on the human brain and cerebellum.
My aim will be to examine the ways in which Vicq d’Azyr’s work on brain structures and functions was influenced by contemporary observations on the human brain by figures such as Albrecht von Haller and Vincenzo Malacarne.
I will also show the epistemological and pedagogical dimension of Vicq d’Azyr’s studies of the human brain.
By looking most specifically at the lavish plates which illustrated Vicq d’Azyr’s treatise, I want to show that his understanding of human brain anatomy may have been informed by pedagogical purposes.
I will also investigate to what extent Vicq d’Azyr’s discovery of the locus coeruleus was pivotal to the field of human brain anatomy.
Finally, I want to show that Vicq d’Azyr was not only an anatomist but also a physiologist, and how his concern for both structures and functions, and for normal and pathological observations of brain structures, shaped his work on the human brain as well as his understanding of contemporary work.
Tobias Cheung
Humboldt-University, Berlin, Germany
Anthropo-Biology in the 1940s: Jakob von Uexküll, Norbert Wiener and Arnold Gehlen on the functional circle of inside-outside-relations.
Uexküll, Wiener and Gehlen focus on order-generating processes, perceptions and inside-outside-relations of living beings: Organic bodies receive impressions, transform them according to a certain scheme and possess different faculties of expression (such as physical action and speech). In Theoretische Biologie (1920/1928) and Bedeutungslehre (1940), Uexküll describes the unity of the impression-transformation-expression-scheme as a functional circle that determines developmental, metabolic and perceptual processes. In Cybernetics (1948), Wiener analyzes feedback mechanisms that control the data import and export of the senses. Gehlen’s anthropo-biology in Der Mensch (1940) is basically a scientific programme to explore the conditions of human existence and of human societies with the help of impression-transformation-expression-schemes. However, Gehlen is mainly interested in the symbolic usage of sensual data in higher animals and humans. For Gehlen, humans need to perform acts through symbols because they are not as much determined by impression-transformation-expression-schemes as other animals. Symbols translate impressions into meaningful expressions. In the paper, I will first compare the models of functional circles and inside-outside-relations of Uexküll, Wiener and Gehlen. In the second part, I will contextualize these models. Uexküll’s model of the functional circle is closely related to the experimental embryology of Roux and Driesch. Wiener often refers to the experiments of Ivan Pavlov on conditioned reflex arcs. Like Pavlov, he compares inside-outside-relations of living systems with telephone networks and signal stations. Gehlen’s anthropo-biology is strongly influenced by the writings of Max Scheler, Helmuth Plessner and Martin Heidegger. Scheler, Plessner and Heidegger argue against the operationalization of the human inside-outside-relation. However, Gehlen’s theory is also part of the debates within the life sciences to which Uexküll and Wiener refer.
Session: VII.3 Room: Newman D Time: Friday 16:30-18:00
Session: VII.10 Room: Peter Chalk 2.3 Time: Friday 16:30-18:00
Matthew Chew
Howard Chiang
Arizona State University, Tempe, AZ, United States
Princeton University, Princeton, New Jersey, United States
H.C. Watson and the Civil Claims of “British” Plants
Hewett Cottrell Watson (1804-1881) was an irascible proponent of phrenology who despised his father, “loved an argument the way a man who can ride well loves a gallop on a spirited horse,” and as a “slashing critic” alienated many of those with whom he interacted. But he was also Britain’s first and foremost authority on the distribution of plants, lynchpin of the London Botanical Society and the Botanical Society and Exchange Club, coiner of the term phytogeography and creator of the “vice-county” system still employed by British conservationists. Emulating the methods of Alexander von Humboldt, Watson set out to compile not merely a complete list of British plants, but an authoritative accounting of all their occurrences. Realizing that his ambition was clearly beyond the abilities of one man in one lifetime, he organized Britain’s plant enthusiasts into a network of collector-observers. In the process he standardized herbarium practices and specimen criteria that remain essentially unchanged. But his reliance on amateurs inevitably led not only to mistakes and misidentifications, but also to intentional misrepresentations by some of his too-competitive correspondent-collaborators. Annoyed by the doubt this cast on his results, Watson composed the first biological definitions of native, alien and less self-evident intermediate conditions in order to justify eliminating suspicious records from his accounting. He continued to refine his system throughout his career, defending it against competing formulations proposed by fellow Briton Joseph Hooker and Swiss second-generation botanist Alphonse de Candolle. Watson’s native and alien survived to become familiar essentialist biogeographical labels, subject to repeated ad hoc redefinition, but his own conception of them was a nominalist response to the very practical problem of dubious data.
Separating Molecules, Building Biology: The Evolution of Electrophoretic Instrumentation and the Material Epistemology of Molecular Biology, 1945-1965
Preparative and analytical methods developed by separation scientists have played an important role in the history of molecular biology (among other fields, especially biochemistry). One such early method is gel electrophoresis, a technique that uses various types of gel as its supporting medium to separate charged molecules based on size and other properties. Yet, historians of science have paid relatively little attention to this material epistemological dimension of biomolecular science. A major exception is historian Lily Kay’s 1988 article on the solution-based electrophoretic apparatus developed by Arne Tiselius. In that article, Kay substantiates her assessment of the Rockefeller Foundation’s direct influence on the rise of molecular biology in the 1930s and 1940s. My work picks up where Kay’s analysis ends, tracing the historical development of gel electrophoresis from the mid-1940s to the mid-1960s, roughly the time when two- dimensional electrophoresis was invented from the merging of molecular-sieving electrophoresis (molecular separation based on the physical property of size) and isoelectric focusing (molecular separation based on the chemical property of charge).
Accordingly, this paper extends the historiographical thread that explores the relationship between modern laboratory practice and the production of scientific knowledge. Claiming that the early 1950s marked a decisive shift in the evolution of electrophoretic methods from moving boundary to zone electrophoresis, I reconstruct a trajectory in which scientists such as Oliver Smithies sought out the most desirable solid supporting medium for electrophoretic instrumentation. Biomolecular knowledge, I argue, emerged in part from this process of seeking the most appropriate supporting medium that enabled discrete molecular separation and visualization. The early 1950s, therefore, marked not only an important turning point in the history of separation science, but also a transformative moment in the history of the life sciences as the growth of molecular biology depended in part on the epistemological access to the molecular realm available through these evolving technologies.
Session: VII.1 Room: Newman B Time: Friday 16:30-18:00
Christine Clavien1, Chloë Fitzgerald2
1) Université de Neuchâtel, Institut de philosophie, Neuchâtel, Switzerland, 2) University of Manchester, Department of Philosophy, Manchester, United Kingdom
The Impossibility of Evolutionary Realism
The main question we would like to deal with in this paper is whether an evolutionary ethicist can consistently hold a realist position, that is, maintain that moral properties exist in the world independently of anyone’s beliefs about what is right or wrong. There are various ways of being a moral realist, and we will argue that none of them are acceptable for an evolutionary ethicist, for reasons internal to the evolutionary point of view. Some forms of moral realism have to be ruled out simply because they are obviously incompatible with an evolutionary approach (all forms of super-natural realism like Platonism or non-naturalistic realism). Two other forms are less obviously, but nonetheless, incompatible, although they have supporters among evolutionary ethicists.
We shall call the first form “crude naturalism”. According to this theory, moral values are natural properties that are independent of the constitution of human nature. For example, biological functionality could ground our moral assertions (William Casebeer). We will provide several arguments against these forms of evolutionary moral realism, the main objection being that they postulate the moral character of specific properties even though this postulate cannot fit with an evolutionary picture of the world.
By far the more promising account of evolutionary realism is what we shall call “human dependent naturalism”, according to which moral values are something like response-dependent properties. These theories aim to provide plausible evolutionary accounts of human species-specific feeling patterns (William Rottschaefer; Robert McShea) or desires (Larry Arnhart) in response to our environment. Moral values are then supposed to emerge out of this interaction between humans and the objects of their activities. However, we will produce several arguments against this way of conceiving moral values. Among them, we will show that, if we can provide a plausible evolutionary account of moral behaviour (in terms of feeling patterns, desires etc.), there is no need to defend moral realism. We will also refer to recent works according to which there are various evolutionary sources of our mode of evaluation which tend to produce incompatible judgments (Stephen Stich); these accounts clash with any realistic view of morality.
One might think that our arguments commit us to a form of error theory according to which we are the victim of an illusion of moral facts (Michael Ruse, Richard Joyce). However, we will argue that moral activity is ruled in a way that permits us to think of norms as being objectively grounded.
Session: IV.8 Room: Peter Chalk 2.6 Time: Friday 9:30-11:00
Carol Cleland
University of Colorado, Boulder, Colorado, United States
Could There Be Undetected Alternative Forms of Microbial Life On Earth?
The assumption that all life on Earth today shares the same basic molecular architecture and biochemistry is part of the paradigm of modern biology. As I shall argue, there is little theoretical or empirical support for this widely held assumption. Scientists know that life could have been at least modestly different at the molecular level and it is clear that alternative molecular building blocks for life were available on the early Earth. If the emergence of life is, like other natural phenomena, highly probable given the right chemical and physical conditions then it seems likely that the early Earth hosted multiple origins of life, some of which produced chemical variations on life as we know it. While these points are often conceded, it is nevertheless maintained that any primitive alternatives to familiar life would have been eliminated long ago, either amalgamated into a single form of life through lateral gene transfer (LGT) or alternatively out- competed by our putatively more evolutionarily robust form of life. Besides, the argument continues, if such organisms still existed, we surely would have encountered telling signs of them by now. As I shall show, these arguments do not hold up well under close scrutiny. They reflect a host of assumptions that are grounded in our experience with large multicellular organisms and, most importantly, do not apply to microbial forms of life, which cannot be easily studied without the aid of sophisticated technologies. Indeed, the tools currently used to explore the microbial world, viz., microscopy (with sophisticated molecular staining techniques), cultivation, and molecular biology techniques (such as PCR amplification of rRNA genes), could not detect an alternative form of microbial life if it
existed. In short, the possibility that the contemporary Earth is host to “shadow microbes” (as yet unrecognized forms of microbial life that differ in fundamental ways at the molecular level from familiar terran life) is worthy of serious scientific investigation, particularly given the profound philosophical and scientific importance that such a discovery would represent.
Session: XII.4 Room: Peter Chalk 1.1 Time: Sunday 9:00-10:30
Sharyn Clough
Oregon State University, Corvallis, Oregon, United States
Triangulation, social location and ophthalmology: Do you see what I see?
Most feminist epistemologists have argued that while the social location of knowers is relevant to the knowledge produced, this relevance does not radically foreclose the possibility of sharing epistemological standards between those who are working from different social locations. We can all (more or less) successfully communicate with each other about basic features of our shared world, triangulating between the actions of others and the features of our world that give those actions their meaning. The possibility of this successful communication significantly weakens the charges of conceptual relativism that have been aimed at feminist science studies. Still, triangulation speaks only to the possibility of successful communication. There are many documented instances when the possibility of successful communication is not realised in practice. Gender differences, intertwined with differences in other social features, seem to matter, even, or especially, to producing, communicating and learning scientific knowledge. But how much do they matter? And in what circumstances? My suspicion is that any answers to these questions will be highly specific to the scientific setting in question and that more empirical studies are needed to document the historical trajectories at work. I outline a preliminary study that focuses on a specific research setting, namely basic and clinical research on intraocular immune response. It is a research setting where, for a variety of reasons, basic triangulation is conceptually ambiguous and/or physically difficult, yet some objective constraints on interpretation can still be identified.
John Collier
University of KwaZulu-Natal, Durban, South Africa
Session: V.3 Room: Newman D Time: Friday 11:30-13:00
Review of the Cohesion Concept of Species
The cohesion concept of species was introduced by E.O. Wiley in his text Phylogenetics (1981). It plays a major role in Brooks and Wiley, Evolution as Entropy, but does not depend on the hypotheses in that book, as it underlies them. The idea in Wiley’s work is closely connected to the idea of species as individuals (Coleman and Wiley 2001), but also has close connections to the evolutionary species concept. A somewhat divergent and more nominalistic approach to a cohesion concept of species was developed by Alan Templeton (1989). Unlike the Wiley cohesion, Templeton’s concept has been taken to support pluralism with respect to species. I will present a dynamical notion of cohesion (Collier 1986, 1988, 2003) developed to fit the Wiley concept, and examine the issue of pluralism within this context, with some criticisms of Templeton’s version, which misses some of the advantages of the Wiley version. If pluralism with respect to species is to be retained, it must be a very watered down version, such that Newtonian systems would also be pluralistic.
REFERENCES
Coleman, K.A., and E.O. Wiley. 2001. On species individualism: A new defense of the species-as-individuals hypothesis. Philosophy of Science 68: 498-517.
Collier, John. 1986. Entropy in Evolution. Biology and Philosophy 1: 5-24.
Collier, John. 1988. Supervenience and Reduction in Biological Hierarchies. In M. Matthen and B. Linsky (eds), Philosophy and Biology: Canadian Journal of Philosophy Supplementary Volume 14: 209-234.
Collier, John. 2003. Hierarchical Dynamical Information Systems With a Focus on Biology. Entropy 5: 57-78.
Templeton, A. 1989. The Meaning of Species and Speciation: A Genetic Perspective. In D. Otte and J. Endler (eds), Speciation and its Consequences. Sunderland, MA: Sinauer Associates, pp. 3-27.
Wiley, E.O. 1981. Phylogenetics: The Theory and Practice of Phylogenetic Systematics. New York: Wiley-Interscience.
Session: III.10 Room: Peter Chalk 2.3 Time: Thursday 16:30-18:00
Alexandra Cook
University of Hong Kong, Hong Kong, Hong Kong
Between Praxis and Episteme: The Herbarium as Boundary Object
In my paper I use the herbaria of the Swiss philosopher Jean-Jacques Rousseau to explore how everyday knowledge communicates with scientific concepts, and how concepts of nature connect with standardized knowledge.
In the 1760s and 1770s Rousseau studied and practiced botany in a variety of contexts, while on the run in Switzerland and England (1763-7), and also after his clandestine return to France in 1767. He not only derived personal satisfaction from his nature forays, collecting and herbaria, but also made the acquaintance of a number of important figures in the botanical world, including Antoine Gouan, Carolus Linnaeus, the Jussieus, uncle and nephew, Joseph Dombey, Etienne Guettard, Marc-Antoine Claret de Latourrette, and the Duchess of Portland. He was particularly well-connected with leading lights of the French provincial Enlightenment in Montpellier and Lyon. Within this wide-ranging botanical network Rousseau was able to engage in fruitful exchange of correspondence, specimens and texts. His herbaria reflect these rich contacts. He likewise became an adept of the natural family system of Jussieu, as well as the artificial sexual system of Linnaeus.
Rousseau's herbaria are relatively unknown and even less studied; some are lost, while others, notably those in Zürich, Montmorency, and Paris, are extant and vary in both style and organization. He made several herbaria from scratch, experimented with different paper sizes, and expanded an important herbarium given to him by Dombey, later a significant botanical explorer of Latin America. Rousseau followed Linnaeus's dictum that a "Herbarium is better than any picture, and necessary for every botanist" (Phil. Bot., par. 18). He also wrote a lucid exposition of how to fabricate herbaria; this exposition is comparable to those of botanists such as Latourrette and Lamarck, although differing from theirs in various particulars.
My examination of the botanical writings and herbaria of Jean-Jacques Rousseau will show how the herbarium can operate as a "boundary object," that is, an object that serves a variety of functions and purposes; it has meaning for Rousseau not only in the realm of ladies' botany and botany proper, but it also serves as currency in reciprocal scientific exchange, as an artisanal artefact, as a representation of nature, and as an impetus to reverie.
Session: XII.8 Room: Peter Chalk 2.6 Time: Sunday 9:00-10:30
Gregory Cooper
Washington and Lee University, Lexington, Virginia, United States
In Search of Community Ecology
Community ecology seemed to come into its own as a distinct subfield in ecology some forty years ago. In the hands of G.E. Hutchinson, Robert MacArthur and others, and bolstered by developments such as niche theory and the theory of limiting similarity, community ecology seemed on the verge of answering Elton's classic question of limited membership in terms of competitive interactions and patterns of resource use. The fledgling discipline reached its zenith with the publication in 1975 of a MacArthur festschrift by Cody and Diamond. Then the wheels came off. At a conference in Wakulla Springs fundamental presuppositions of the new approach were questioned — that populations were regulated by density-dependent factors, that competition was indeed the queen of ecological interactions, that the patterns being explained by the new community theory were in fact real. Since then, community ecology has not disappeared, but it has become rather polymorphic. The search for community assembly rules has chugged along with little notable success. At the same time, much of the energy behind the classic paradigm has been redirected into a kind of multi-species population ecology whose history antedates the Hutchinson/MacArthur revolution. Other forms have appeared, including macroecology, historical/geographical approaches, biogeochemistry-based approaches, the deployment of complex adaptive systems theory, and perhaps most remarkably, Hubbell's neutral theory, which assumes that community patterns reflect a kind of ecological drift (where species interactions play no significant role). What, then, is community ecology? In a recent paper, Kim Sterelny has proposed a model for thinking about local ecological communities. The model includes dimensions for internal causal structure, boundedness, and the existence of emergent properties. The position of communities within this space indicates the kinds of community studies it would be appropriate to undertake. Sterelny is asking the right kinds of questions – but largely about the wrong kind of entity. To the extent that ecologists study such communities their questions tend to be about quite restricted subsets of species inhabiting these communities – restricted in terms of taxonomic status, guild membership, trophic level, etc. Again, another strand of community ecology – Brown's macroecology – tends to focus on patterns at much broader spatial and temporal scales than the local community. The situation is similar with regard to the other morphs of community ecology. This paper sets out a modified form of Sterelny's ontological state-space approach and seeks to locate the various strands of community ecology therein. The goal is to get a clearer picture of the ways in which the controversies surrounding community ecology are grounded in (1) genuine disagreements about alternative ways to study a given class of phenomena (as opposed to an unrecognized complementarity in the study of distinct but related phenomena), and (2) genuine disagreements about the ontological presuppositions underlying the various approaches to the study of ecological communities.
Athel Cornish-Bowden, Maria Luz Cardenas
Institut de Biologie et Microbiologie, Marseilles, France
Session: I.3 Room: Newman D Time: Thursday 11:30-13:00
Metabolic circularity as a guiding vision for systems biology
Despite the various theories that have been developed to explain life, the subject has attracted little interest among molecular biologists, and the development of biology during the past half century has been firmly based on reductionist analysis. The rapid growth of systems biology during the 21st century has made surprisingly little difference, because much of the activity has been in the accumulation of detailed facts, with little integration into the understanding of whole organisms that will be needed for a proper understanding of life. In short, improved technology has permitted tremendous advances, and the difficulty of obtaining data is no longer a limiting factor, but the subject has lacked a guiding vision, or a long-term objective. We suggest that Robert Rosen's (M,R) systems may offer the guiding vision that systems biology has lacked until now. These address the fundamental question of how organisms organize themselves, maintaining their identities in the face of degradation of their components and changes in their environments. An apparent conflict between Rosen's ideas and a thermodynamic view of organisms is his insistence that organisms are organizationally closed, whereas it is now well understood that they are thermodynamically open. However, this conflict is only apparent, because two different levels of causation are at issue: the energetic needs of organisms require them to be open to material causation, whereas the need for all catalysts to be produced within them requires them to be closed to efficient causation. We have proved that the central idea of metabolic circularity is mathematically possible, and we have determined the limitations within which this organizational invariance is possible. Study of how a model (M,R) system might be defined indicates that this closure will only be possible if some (probably many) components in an organism fulfil multiple functions: if correct, this would imply that the increasingly observed phenomenon of "moonlighting" is not just an interesting observation about organisms, but is an absolute necessity for their existence. However, the model that we have developed does not immediately explain how the replacement function is defined by the metabolic state of the organism. This too may be more of an apparent problem than a real one, because if catalytic cycles are written explicitly as sequences of chemical reactions the question of which catalyst catalyses which reaction does not arise.
Session: XI.4 Room: Peter Chalk 1.1 Time: Saturday 16:00-17:30
Lindsay Craig
University of Cincinnati, Cincinnati, OH, United States
Information and DNA: How The Unexplanatory Metaphor Explains
Biologists often use metaphors to explain complex phenomena in a more intuitive way. Such metaphors are meant to provide understanding of the natural phenomena under study, which, ideally, is the point of scientific work. By examining pervasive biological metaphors, such as the “information” metaphor of molecular biology, one can see that while metaphors do not provide substantive understanding of underlying mechanisms working at the molecular level, it is still possible for metaphors to be of some explanatory value. Examination of recently published work in molecular biology shows that there is no clear or concise understanding of the “information” provided by biological molecules. Nonetheless, I argue that the “information” metaphor does work as a valuable explanatory tool by providing an intuitive link between biological molecules and their role in nature. I argue further that metaphors in science generally are valuable explanatory tools despite their imprecise nature.
Carl Craver
Washington University in St. Louis, St. Louis, MO, United States
Session: VII.2 Room: Newman C Time: Friday 16:30-18:00
When Mechanistic Models Explain: The Hodgkin and Huxley Model of the Action Potential
In 1952, Hodgkin and Huxley published a mathematical model of the action potential in the squid giant axon. The model is derived in part from laws of physics and chemistry, such as Ohm's law, Coulomb's law, and the Nernst equation, and it can be used to derive myriad electrical features of many different kinds of neuron in many different species. Despite this accomplishment, Hodgkin and Huxley insist that their model fails as an explanation. This is curious if one thinks, as many did in the 1950s, that to explain a phenomenon just is to show that it follows from laws of nature coupled with initial conditions. I argue that Hodgkin and Huxley regarded their mathematical model as a phenomenological model and that they regarded their understanding of the action potential as sketchy at best. I also argue that they were right. There is a widely accepted distinction between merely modeling a mechanism's behavior and explaining it. Models play many roles in science. They are used to make precise and accurate predictions. They are used to summarize data. They are used as heuristics for designing experiments. They are used to demonstrate surprising and counterintuitive consequences of particular forms of systematic organization. But some models, such as the mechanistic model that continues to develop in the wake of Hodgkin and Huxley's work, are also explanations. I will discuss some of the ways that models, even very useful models, can fail to provide explanations in order to build a positive account of when mechanistic models explain. I will also show how this mechanistic view of constitutive explanation differs from its nearest neighbors, such as Cummins 1975, Lycan 1989, and Machamer et al. 2000.
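For readers without the 1952 paper to hand, the membrane equation at the core of the Hodgkin-Huxley model (not reproduced in the abstract above) takes the following standard form, with V the membrane potential and m, h, n the empirically fitted gating variables; written in LaTeX:
C_m \frac{dV}{dt} = -\bar{g}_{\mathrm{Na}}\, m^{3} h\,(V - E_{\mathrm{Na}}) - \bar{g}_{\mathrm{K}}\, n^{4}\,(V - E_{\mathrm{K}}) - \bar{g}_{L}\,(V - E_{L}) + I_{\mathrm{ext}}(t)
\frac{dx}{dt} = \alpha_x(V)\,(1 - x) - \beta_x(V)\,x, \qquad x \in \{m, h, n\}
The rate functions \alpha_x and \beta_x were curve-fitted to voltage-clamp data rather than derived from an underlying molecular mechanism, which is one reason the model can be read as phenomenological in the sense discussed above.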
Session: IX.9 Room: Peter Chalk 2.5 Time: Saturday 11:00-12:30
Richard Creath
Arizona State University, Tempe, AZ, United States
Cassirer's History of Hope
The central section of Ernst Cassirer's The Problem of Knowledge is a history of biology and an extraordinary one. While it covers chiefly the 19th century and discusses almost all the figures we would emphasize now, the issues it considers (and on which Cassirer is astoundingly knowledgeable) are still very much with us. More importantly, he examines this material from a perspective that, as far as I know, is unique. Rather than considering a series of doctrines, i.e., products of research, Cassirer tries to uncover a series of ambitions for biology and of the corresponding strategies for getting there. What would an ideal biology look like? What kind of evidence and argument might be relevant and available for getting toward that ideal? These questions about the ideal were at least as important as the results actually achieved. Hope can be, as Cassirer shows, a very powerful force. I am not under the illusion that modern philosophers or historians of biology, or modern biologists for that matter, will wish to copy Cassirer's approach. But certainly it can expand the range of questions we ask and the repertoire of historical and scientific influences we are prepared to consider. In this way we can expand not only our understanding of 19th century biology but also of our own crafts as philosophers, historians, and scientists.
Session: IX.5 Room: Peter Chalk 1.3 Time: Saturday 11:00-12:30
Thomas Cunningham
University of Pittsburgh, Pittsburgh, PA, United States
A Reply to Naïve Mechanicism: J. S. Haldane's Shift from Vitalism to Holism and its Effects on his Philosophy of Biology
This paper will not be discussing a popular hero of population genetics, J. B. S. Haldane. Rather, it considers the philosophical-scientific career of his father, John Scott Haldane. Born a Scottish aristocrat, he graduated from the University of Edinburgh Medical School and taught at Oxford, before developing novel approaches for dealing with poisonous gases during World War I. During his celebrated career in physiology, Haldane developed an unorthodox understanding of the relationship between the sciences and philosophy. He is a fascinating historical figure precisely because of his explicit philosophical commitments and his substantial contributions to the sciences of physiology and biology. Haldane both took philosophical questions very seriously and strove to accommodate his answers to them in his scientific studies. At first pass he seems to be an enigma as he disavowed the mechanistic outlook of his contemporaries while concurrently endorsing a somewhat mechanical approach to experimentation. During his long career, his philosophy shifted slightly from vitalism to holism. This paper seeks to explicate the philosophical content behind this terminological shift in light of Haldane's life and work. This will be accomplished by carefully considering his conceptions of mechanism, vitalism, and holism in relation to his philosophy of biology.
Session: VI.10 Room: Peter Chalk 2.3 Time: Friday 14:30-16:00
Taika Dahlbom
University of Turku, Turku, Finland
Specimens: Between Nature and the Zoological Gaze
Specimens, models and instruments are all participants in the material culture of science as well as the conceptual. Of these, only the specimen has been seen to embody the truth of nature since sixteenth-century curiosity cabinets. In this paper I will present 300-year-long biographies of some zoological specimens currently located at the Zoological Museum of the University of Copenhagen in order to scrutinize, in a historical frame, how the zoological gaze produces facts of nature from objects of natural origin, how these facts are related to the material object, and how the zoological and historical facts coexist in the material body of a specimen.
The biographical approach to the specimens highlights the specimens' relation to truth and facts, as the truth about them is what the zoologist records. The material specimens are preserved to always stay the same, while the immaterial facts that a specimen is seen to present are, however, subject to change. If relevant new immaterial facts cannot be produced from the same artifactual entities, new specimens are needed. In its ability to represent not only the contemporary species, but signify historical species and facts as well, the specimen could be conceptualized as an emblem consisting of a material object which becomes artifactual as it is laden with the cultural significations the zoological gaze has seen it present. The contemporary reality of a specimen is, on one hand, a synchronical unison of the material presence and its contemporary zoological signification (species). On the other hand, a specimen is a diachronical continuum of material presence that carries former zoological significations and significations that come from outside the zoological field.
Thus, in addition to nature, a specimen embodies history. The success or failure of a specimen within a zoological paradigm is realized in its ability to present or represent a truth of nature. Success allows the specimen to continue as a zoological specimen despite changes in its factuality. Failure leads to it transforming into a historical specimen, which, although unable to represent nature, is yet able to gather historical significations and thus represent the history of zoology, for example. In the everyday zoological practise, however, the coexistence of the historical factuality with the zoological factuality is and has been a complicated issue, as outside the artifactual sphere of zoology, the inclusion of the historical challenges scientific objectivity.
Lindley Darden
University of Maryland, College Park, College Park, MD, United States
Session: III.8 Room: Peter Chalk 2.6 Time: Thursday 16:30-18:00
Mechanisms in Biology in Hierarchical Context
The new mechanistic philosophy arose as philosophers of biology examined detailed scientific cases and analyzed the wide usage of the term “mechanism” by contemporary biologists (e.g., Machamer, Darden, Craver 2000; Bechtel 2006; Darden 2006). Insights have arisen from and been applied to an understanding of mechanisms in molecular biology, neurobiology and cell biology. Examples of molecular biological mechanisms include the mechanism of DNA replication and the mechanism of protein synthesis.
The growing philosophical literature on biological mechanisms addresses such issues as the following: the nature of theories in experimental biology, mechanistic explanation, and reasoning strategies for discovering mechanisms. Theories are viewed as sets of mechanism schemas. Mechanistic explanation consists of the instantiation of a mechanism schema at an appropriate “bottom out” level to adequately describe a mechanism
that produces the puzzling phenomenon to be explained. When what is sought is an explanatory mechanism, that goal provides much guidance for reasoning in discovery. Discovery is an extended process of generation, evaluation, and revision of mechanism schemas and sketches. The seemingly intractable problem in philosophy of science of understanding the discovery process has gained new purchase with the study of reasoning strategies for discovering mechanisms.
This talk will raise questions about how to situate such mechanisms into a wider hierarchical context. Looking down, we see that some macromolecules, such as ATPase, are referred to, as Michel Morange has noted, as "molecular machines." How do they play the role of entities in higher-level, containing mechanisms? Looking upward, how are such mechanisms to be situated in larger "systems" (a term now used by molecular biologists interested in studying such topics as proteomics)? How does this wider hierarchical context affect the analyses of theory structure, explanation, and discovery within the mechanistic literature?
REFERENCES:
William Bechtel (2006), Discovering Cell Mechanisms: The Creation of Modern Cell Biology. New York: Cambridge University Press.
Darden, Lindley (2006), Reasoning in Biological Discoveries. New York: Cambridge University Press.
Machamer, Peter, Lindley Darden, and Carl F. Craver (2000), "Thinking About Mechanisms," Philosophy of Science 67: 1-25.
Session: VIII.2 Room: Newman C Time: Saturday 9:00-10:30
Naomi Dar
Independent Scholar, Modi'in, Israel
Are Biological Structures Aesthetic Objects?
The aesthetic value of biological entities is often mentioned as an argument for environmental preservation. Outside this specific context the aesthetic evaluation of biological entities seems irrelevant. However, biologists frequently refer to biological structures as beautiful or as harmonious. Such expressions may seem subjective, out of (scientific) context, and are taken to be distinct from science. However, looking more closely at the expressions and at the scientific framework in which they were articulated reveals their scientific significance.
Often, the application of the jargon of aesthetics to biological entities comes from students of the formation and development of biological structures. It is not surprising that the structural approach applied by those scientists is shared by some philosophers of aesthetics as well, an approach characterized by its formality. Another feature typical of structuralism is the analysis of research items (biological entities/aesthetic entities) according to the inner relations of their parts. Thus, although biologists do not examine the aesthetic values of biological entities, students of development and form who apply the structural method refer to the inner relations of biological entities as harmonious and therefore as having an aesthetic value, unlike natural selection, which picks the best-functioning system at the cost of aesthetic value and harmony. Hence, the aesthetic point of view of the structural approach to biological systems sharpens the distinction between the functional and the structural approach. Examples taken from Darwin, D'Arcy Thompson, Waddington, S. J. Gould and Stuart Kauffman will help to illustrate the point.
Session: V.11 Room: Peter Chalk 2.1 Time: Friday 11:30-13:00
Jonathan Davies
University of Exeter, Exeter, Devon, United Kingdom
Distributed and Local Causation in Systems Biology
If we ask the question, "what is the cause of organism X having the form that it does?" or, "why does organism X give rise (through e.g. sexual reproduction) to an organism that is similar to itself?", we have two broad types of explanation: the localised and the distributed. Local causal explanations answer "why" (or "how") questions with the identification of an object (localised in space) or event (localised in time) responsible for the phenomenon to be explained. Distributed causal explanations answer "why" (or "how") questions with the identification of global properties (distributed in space) or processes (distributed in time) responsible for the phenomenon to be explained.
I hope to show that the two categories that I am employing (i.e. distributed and localised causal explanations) cannot simply be subsumed under, e.g., the debates around preformationism/epigenesis, atomism/holism, or materialism/vitalism, but pick out two rival conceptions of biological causation (and indeed causal explanations in general) that have cut across the dichotomies mentioned above. I will attempt to locate a (probably quite arbitrary) selection of approaches to the problems of biological form within my framework and I hope ultimately to argue that a more adequate understanding of biological phenomena can be achieved by the acknowledgement of these rival approaches and the acceptance of the need to integrate them. I will further argue that the iterative procedure of combining systems-theoretic and pragmatic systems biology (i.e. top-down and bottom-up systems approaches to biology), described by (but not, it seems, carried out by) Leroy Hood and others, offers a possible way of carrying out this integration.
Session: IV.3 Room: Newman D Time: Friday 9:30-11:00
Philippe De Backer1, Dirk Gevers1,6, Kyung-Bum Lee2,5, Toshishiro Aono2, Chi-Te Liu2, Shino Suzuki2, Tadahiro Suzuki2, Takakazu Kaneko4, Manabu Yamada4, Satoshi Tabata4, Doris M. Kupfer3, Fares Z. Najar3, Graham B. Wiley3, Bruce Roe3, Hiroshi Oyaizu2, Marcelle Holsters1
1) VIB/Ghent University, Ghent, Belgium, 2) Biotech. Res. Cent., Univ. Tokyo, Tokyo, Japan, 3) University of Oklahoma, Oklahoma, United States, 4) Kazusa DNA Res. Inst., Tokyo, Japan, 5) Center for Information Biology, DDBJ National Institute of Genetics, Tokyo, Japan, 6) Massachusetts Institute of Technology, Massachusetts, United States
Comparative and evolutionary genomics of Azorhizobium caulinodans as a case study for the workings in the post-genome era
Systems biology aims at unravelling the molecular mechanisms underlying the cellular responses that arise from the interplay between organisms and their environment. The use of computers and extensive databases is another characteristic of Systems Biology. Integrating data is not straightforward, so attention is often turned towards prokaryotic organisms, as their regulatory complexity is "simpler". Here we present the results of a comparative genomics study of Azorhizobium caulinodans and use this as a means to critically reflect on the modeling and bioinformatics approaches currently used. Azorhizobium caulinodans grows in the soil as a free-living organism but can also live as a nitrogen-fixing symbiont inside root nodule cells of legume plants. The interaction between Azorhizobium and Sesbania rostrata has become a standard model for this type of nitrogen-fixing symbiosis. In a first step, the genome sequence is structurally annotated using both intrinsic and extrinsic information. Both approaches have their limitations and pitfalls, which will be discussed. A second step includes the functional annotation of the genome and proteome. A total of 4689 open reading frames (ORFs) have been identified that appear to encode polypeptides, 3720 (80%) of which have been assigned putative functions based on their similarities to database sequences with assigned functions, while 969 (20%) have little or no homology to sequences in public databases based on the Li-Rost criteria (30% overall sequence identity and an aligned region of more than 150 amino acids). However, the assignment of function by homology gives only a partial understanding of a protein's role within a cell. In order to increase the functional information on Azorhizobium's proteome, proteins were assigned a COG classification. Both approaches for clustering proteins have their limitations, including coping with gene duplication, local vs global homology, and multiple-domain problems. A final step in our analysis so far is the detection of syntenic regions using i-ADHoRe. When placed in a phylogenetic framework, this analysis reveals the evolutionary history of Azorhizobium. Here also, the choice of parameters can influence the end-result significantly. A global discussion of the use of bioinformatics, its problems, pitfalls and opportunities is necessary to advance in Systems Biology. The mathematical relevance and significance of parameter choice, choice of script and decision tree are all important aspects that deserve thorough thought when implemented and may even shed some light on the debate on holism and reductionism.
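The Li-Rost criterion mentioned above (at least 30% overall sequence identity over an aligned region of more than 150 amino acids) can be expressed as a simple filter. The following Python sketch is purely illustrative and is not the authors' pipeline; the Hit records, identifiers and example values are hypothetical.
from typing import NamedTuple

class Hit(NamedTuple):
    query: str              # ORF identifier from the annotated genome
    subject: str            # database protein with an assigned function
    percent_identity: float # overall sequence identity of the alignment
    alignment_length: int   # length of the aligned region, in amino acids

def passes_li_rost(hit: Hit, min_identity: float = 30.0, min_length: int = 150) -> bool:
    # Keep a hit only if it meets both thresholds quoted in the abstract.
    return hit.percent_identity >= min_identity and hit.alignment_length > min_length

# Hypothetical example hits, not real data.
hits = [
    Hit("orf0001", "nifH_homolog", 42.5, 280),
    Hit("orf0002", "hypothetical_protein", 28.0, 400),
    Hit("orf0003", "nodD_homolog", 55.0, 90),
]
with_function = [h.query for h in hits if passes_li_rost(h)]
without_function = [h.query for h in hits if not passes_li_rost(h)]
print(with_function, without_function)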
Miquel De Renzi
University of Valencia, Valencia, Comunitat Valenciana, Spain
Session: I.11 Room: Peter Chalk 2.1 Time: Thursday 11:30-13:00
Alberchian Variations on Evolutionary Palaeobiology
Many scientists were practitioners of both palaeontology and embryology during the XIXth century. Throughout the XXth century, there have been palaeontologists interested in developmental biology as well as developmental biologists interested in palaeobiology, and Stephen Jay Gould and Pere Alberch were good examples. They were both among the early founders of evo-devo thinking, and heterochrony was a shared leitmotiv for them. Alberch gave support based on development to Eldredge and Gould's punctuated equilibria theory. Alberch was a champion of the internalist metaphors in evolution, whereas the punctuated equilibria were initially explained in terms of orthodox environmentalist neodarwinism.
Nevertheless, Pere was immediately aware that the two main features of punctuated equilibria, i.e., morphological stasis of lineages and evolutionary jumps, had a comfortable explanation based on the properties of development. He recognised a long tradition of research on development, whose main issues evidenced its great stability as a process. This stability would be an internal characteristic. Because of this, development would be hardly troubled by genetic change or random fluctuations of its own dynamics. In addition, development would be very resilient to changes in environmental parameters. However, accumulation of changes above certain thresholds would have dramatic consequences. Alberch suggested, as an original contribution, that this previous knowledge could be unified in terms of non-linear dynamic systems theory. This would be a holistic conception against the reductionist point of view of many geneticists at that time. Accumulation of genetic novelties in isolated small populations would cross over thresholds for many individuals (jumps), resulting in important phenotypic discontinuities. After reproductive isolation, the new developmental pattern would be maintained because of its stability (stasis). In addition, morphologies could not be produced at random. Alberch's thinking afterwards influenced another very integrative and holistic approach to understanding morphological evolution developed during the early seventies: constructional morphology (now biomorphodynamics), due to the German palaeontologist Adolph Seilacher. In biomorphodynamics, the phylogenetic legacy constrains the variability and thus the functional and adaptive possibilities and their indefinite improvement by natural selection. This was a criticism of adaptationism. During the eighties, Alberch's papers furnished arguments to several authors to interpret the phylogenetic legacy as partly due to developmental constraints.
Raf De Bont
University of Leuven, Leuven, Belgium
Session: V.6 Room: Peter Chalk 1.5 Time: Friday 11:30-13:00
Between the Laboratory and the Deep Blue Sea: The Lab-Field Border in the Marine Stations of Naples and Wimereux
The rise of biological marine stations in the latter decades of the nineteenth century is often associated with the high tide of evolutionary morphology and the period of the 'laboratory ethos'. Therefore it is generally assumed that field research in such stations was only of very limited importance (e.g. by Kohler, 2002). This image is based on the analysis of big marine stations like the international station in Naples led by Anton Dohrn or the Marine Biological Laboratory of Woods Hole in the United States. There were other influential stations, however, that functioned in other ways – allowing or even enthusiastically encouraging field work. One such station was that in Wimereux in northern France, founded by the naturalist Alfred Giard in 1874. Although much smaller than the station in Naples, Giard's marine station was of great influence in its region, attracting important numbers of biologists from the universities of, amongst others, Lille, Paris and Brussels. Several of these biologists became propagators of Giard's 'éthologie' (which, in his own definition, was 'the science of the behaviour and the interaction of living beings with other organisms and with their environment'). As this 'éthologie' was virtually impossible without field work, Wimereux would become one of the European outposts of field biology during 'the age of the laboratory'.
In my paper, I would like to address the important differences that existed among late nineteenth-century marine stations by comparing Naples with Wimereux. The differences in attitude towards field research between these two stations were connected with divergences in architecture, in concrete arrangement and implantation of the station, in styles of management and in their relations with amateur scientists, tourists and fishermen. Dohrn and Giard not only had different conceptions about field biology, they also created very different kinds of marine stations. By studying the relation between the two, this paper will serve as a case study to analyse the interaction between research programmes on the one hand and the organisation and material culture of working places on the other.
Didier Debaise
Max Planck Institute for the History of Science, Berlin, Germany
Session: X.10 Room: Peter Chalk 2.3 Time: Saturday 14:00-15:30
What is a philosophy of individuation? Simondon's theory of the living
In the 1960s the French philosopher Gilbert Simondon wrote three books: L'individu et sa genèse physico-biologique, L'individuation psychique et collective and Du mode d'existence des objets techniques. These three books had an important influence on philosophers like Canguilhem, M. Merleau-Ponty, G. Deleuze and, more recently, B. Stiegler. The ambition of Simondon was to construct a new axiomatic based on the living sciences which would concern physical as well as biological individuation. His main proposition is that one should reverse the analysis of the individual: instead of taking the individual as a paradigm (either as a cause or as a finality of a process), one should focus on the process of individuation in itself. This shift implies, according to Simondon, a transformation of our modes of knowledge.
We will first focus on the "axiomatic" of Simondon (concepts of singularity, metastability and transduction) and on his approach to the living, and secondly, we will develop its implications for contemporary theories of knowledge.
Ute Deichmann
Leo Baeck Institute London/University of Cologne, Koeln, Germany
Session: III.1 Room: Newman B Time: Thursday 16:30-18:00
Different research practices in early molecular genetics: Oswald T. Avery's and Max Delbrück's revolutionary findings and early responses
The demonstration by Avery and his younger associates Colin M. MacLeod and Maclyn McCarty in 1944 that the substance capable of bringing about a lasting transformation of pneumococcal types – that is, apparently heritable changes in bacteria – is DNA associated, for the first time, a genetic phenomenon clearly with a nucleic acid. It challenged the then generally accepted view that proteins were the material of genes. Avery and his associates' discovery thereby became the basis of all further studies on the structure and genetic functions of DNA. Despite the revolutionary nature of Avery's discovery, his research practices as well as his general attitudes were conservative.
Theoretical physicist Max Delbrück was motivated to conduct research on the concept of the gene and gene replication by Niels Bohr's romantic vision that his physical concept of complementarity might be used to explain basic properties of life. However, Delbrück did not remain a theorist but developed methods to study gene replication. By using phage in order to tackle the questions of gene properties and replication he developed a new experimental and theoretical approach in genetics. Unlike Avery he became one of the leading figures in early molecular genetics and received a Nobel Prize (in 1969).
This paper will examine and compare Avery's and Delbrück's research practices and the early responses to their major work. This includes an analysis of Avery's work by the "phage group" created by Delbrück, which comprised leading scientists in what would soon be called molecular biology. Surprisingly, despite their prominence in this new emerging field they largely neglected Avery's paper. Delbrück did not cite it even once. It will be discussed to what extent the differences in the research practices of Delbrück and Avery may have been a major cause for this neglect. Thus a close observer of Avery's and Delbrück's work pointed to the fact that "certain members of the 'phage group' regarded the orthodox chemical approach to the understanding of biological phenomena as pedestrian, too slow, and not revolutionary enough for their intellectual ambition." In addition, surprisingly, Max Delbrück and his colleagues belonged to those scientists who most strongly adhered to the dogma that genes were proteins. The paper aims to explain why Avery, a conservative scientist using long-established chemical and microbiological methods, was able to challenge this dogma.
Richard G. Delisle
Philosophy, Université de Montréal, Montréal, Québec, Canada
Session: I.4 Room: Peter Chalk 1.1 Time: Thursday 11:30-13:00
Unpacking the Evolutionary Synthesis: How Can So Many Epistemological and Metaphysical Issues Stand Within Such a Compact Explanatory Structure?
It is not uncommon to define the Evolutionary Synthesis as the acceptance of two main conclusions: 1) that gradual evolution is explained by small genetic changes acted upon by natural selection; 2) that speciational and macroevolutionary events are consistent with the known genetic mechanisms.
Although the Evolutionary Synthesis can be defined slightly differently by broadening its definition, the fact remains that this general definition stems from selecting relevant theoretical and conceptual elements fitting the view of a compact explanatory structure among the writings of the founding fathers of Neo-Darwinism. A significant portion of the historical and epistemological analyses published since the late 1950s is conducted within this perspective of evolutionary biology.
It is suggested that a fresh perspective on the entity called Neo-Darwinism might be gained by reversing the approach, that is, by defining it on the basis of the place it occupies within the global vision of each founding father. This approach yields surprising results when comparatively applied to Ernst Mayr, Theodosius Dobzhansky and Bernhard Rensch. Whereas Mayr's view stands squarely within the standard conception of Neo-Darwinism, deriving his understanding of all evolutionary phenomena from it, Dobzhansky's and Rensch's views, for two entirely distinct reasons, incorporate the neo-darwinian mechanisms in much broader explanatory structures – cosmic visions of evolution – in which these mechanisms are of secondary importance.
Not only does our analysis reveal that the neo-darwinian mechanisms serve different roles in the explanatory structures of the various contributors to Neo-Darwinism, but it also raises epistemological and metaphysical issues usually believed to be entirely irrelevant to modern evolutionary biology.
Session: VIII.7 Room: Peter Chalk 1.6 Time: Saturday 9:00-10:30
Isabel Delgado Echeverria
Universidad de Zaragoza, Zaragoza, Spain
Nettie Maria Stevens and the Controversy About Biological Sex Determination
Nettie Maria Stevens carried out important studies on the connection between chromosomes and sex determination around 1900. At the time sex determination was an area of some dispute, with the majority of scientists predisposed to explaining sex on the basis of environmental or metabolic factors. Stevens's work resulted in cytological evidence indicating that the behaviour of specific chromosomes was connected with sex determination, which led to the modern concept of chromosomal sex determination and enabled the question to be linked to the newly discovered Mendelian laws of heredity. Her research on different species of insects, including Drosophila, provided technical innovations in microscopy and cytogenetics, which in turn were successfully used by Morgan's team, and her findings on Drosophila chromosomes were fundamental to the establishment of the chromosome heredity theory.
This paper analyzes Stevens's work in the context of different research lines, contrasting her approach with that of other cytologists and geneticists, including McClung, Wilson and Morgan. McClung's argument about the role of the accessory chromosomes as bearers of male qualities is explored as an example of masculinity overvaluation, linked to modern genetic analysis of Y chromosomes. Such discourse starts from the assumption of a universal mechanism for sex determination in plants, animals and human beings. Stevens, by contrast, encouraged considering only the evidence from the species under study and avoided broad generalizations. This comparison is explored as an instance in which gender may have influenced different scientific styles.
Session: II.8 Room: Peter Chalk 2.6 Time: Thursday 14:30-16:00
Julien Delord
Konrad Lorenz Institute, Altenberg, Austria
Neutral theories and the unification of evolutionary biology
Ecology has recently seen the emergence of a new and controversial theory aimed at explaining the dynamics and assembly of species communities. The Unified Neutral Theory of Biogeography and Biodiversity (UNTBB) developed by Stephen Hubbell in 2001 relies on the assumption of neutrality (in terms of vital rates) among the individuals making up the community, whatever their species. The species composition and abundance evolve only by stochastic processes ("ecological drift") under structural constraints like the rate of speciation, the size of the community and dispersion limits. The principles and mathematical structures of UNTBB were directly inspired by the neutral theory of molecular evolution developed at the end of the 1960s by Kimura and Ewens, among others. Beyond the debate about how to explain the theory's success in fitting many biogeographical data, the advocates of UNTBB claim that their theory, like Kimura's, possesses important epistemological virtues: conceptual simplicity, a great level of generality, and a great unifying power. This last claim will probably spark great interest among philosophers of biology. Indeed, in biology attempts to unify different theories have consisted, on the one hand, in reducing one theory to another (e.g. Mendelian genetics reduced to molecular genetics) and, on the other hand, in synthesizing different theories together (e.g. the neo-darwinian synthesis). So far, only the physical sciences have experienced "hard" unification between theories by identification of classes of models with analogous mathematical structures (e.g. Maxwell's electromagnetism). The development of independent neutral theories in biology at both the micro (genetic) and macro (ecological) levels could give the opportunity to construct a new unifying evolutionary theory structured by a neutral mathematical model. The necessary condition would be to find bridge terms between the two domains in order to give a coherent interpretation of this new theory.
However, is it conceptually possible? Can these two theories really be unified? And is it sufficient to account for all of evolutionary biology? Following Margaret Morrison's analysis of theory unification, we will criticize recent attempts at unification, present our analysis of the problem and discuss its limits with regard to the general question of to what degree evolutionary biology can become a "physical" science.
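The "ecological drift" that the abstract refers to can be made concrete with a minimal simulation. The sketch below is not Hubbell's model as such, only an illustration of zero-sum neutral dynamics under an assumed speciation rate nu; all parameter values and names are illustrative.
import random

def neutral_drift(community_size=200, nu=0.01, steps=10000, seed=1):
    # Every individual is equivalent in its vital rates, whatever its species.
    random.seed(seed)
    community = [i % 10 for i in range(community_size)]  # ten founding species
    next_species_id = 10
    for _ in range(steps):
        dead = random.randrange(community_size)          # one individual dies
        if random.random() < nu:                         # speciation fills the gap
            community[dead] = next_species_id
            next_species_id += 1
        else:                                            # otherwise a random survivor reproduces
            parent = random.randrange(community_size)
            while parent == dead:
                parent = random.randrange(community_size)
            community[dead] = community[parent]
    return community

abundances = {}
for species in neutral_drift():
    abundances[species] = abundances.get(species, 0) + 1
print(sorted(abundances.values(), reverse=True))  # a rank-abundance pattern produced by drift alone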
Dewey’s Darwinism and The Baldwin Effect
“The Influence of Darwinism on Philosophy” is the title of a famous l910 essay by John Dewey. Peter Godfrey- Smith has discussed it in Complexity and the Function of Mind in Nature (Cambridge, 1996). He rightly attributes to Dewey, in the context of the latter’s polemic against Spencer, a keen appreciation of what Godfrey-Smith calls “the environmental complexity thesis.” This is the idea that “ the function of cognition is to enable the agent to deal with environmental complexity.” Dewey was familiar with the general idea since days as doctoral student in biological psychology; it had already begun to be worked out by William James, whose self-description as a Darwinian psychologist, llike Dewey’s after him, was intended to mark strong opposition to Spencer. Dewey himself, however, did not become a Darwinian in this sense until 1884-6. Until then, he continued to insist, much to the annoyance of his graduate mentors, on encoding in a neo-Hegelian idealist conceptual framework his lasting belief that social interaction, rather than individual psychology, is the causal site at which mental phenomena form. In 1894-1896, however, Dewey suddenly changed jobs, abandoned his Christianity and idealist philosophy with it, and became a fully naturalized, Darwinian evolutionary psychologist.
This paper is, in one sense, a footnote to Godfrey Smith. On the basis of a closer reading of essays, book reviews, and (hopefully) letters from that period than Godfrey- Smith provides, I will show that the environmental complexity thesis was already up and running in Dewey in the mid 90s, albeit with his characteristic stress on social environments and continued quibbles with James’s individualism. More significantly, I will argue that the catalyst for Dewey’s sudden naturalistic turn was the intense press battle then raging among evolutionary psychologists about what would later called “the Baldwin effect.” According to this idea, the unpalatable anti-Lamarckian implications of what Dewey in 1894 called “Weismannism in its extreme form” could be circumvented by recognizing that spontaneous adjustments to environmental contingencies exhibited by infants could be differentially selected by reinforcement from parents and protracted across generations by imitation and other forms of learning. These “ontogenetic adaptations” could later be phylogenetically embedded by coinciding (rather than induced) changes in the germ line factors.
I will concentrate in this presentation on evidence
provided by a passage from Dewey’s l895 essay on “Evolution and Ethics.” It refers to “ those having competent knowledge of details” as “ having good reason [for saying that] not only is one form of life as a whole selected at the expense of other forms [for a population], but one mode of action in the same individual is constantly selected.” I will provide evidence that the competent people referred to were the “Baldwin boosters” of the day: Baldwin himself; Henry Fairborn Osborn, and especially Conwy Lloyd morgan, who discussed the relevant concepts at the University of Chicago during this period.
I will end with some reflections on how Dewey’s uptake of the controversy, even though it may have been somewhat incoherent, entered into the very foundation of “instrumentalism,” his particular version of pragmatism.
Session: II.1 Room: Newman B Time: Thursday 14:30-16:00
Oren Harman1, Michael Dietrich2
1) Bar Ilan University, Ramat Gan, Israel, 2) Dartmouth College, Hanover, NH, United States
On Rebels, Icons, and the Value of Dissent
From the analysis of eighteen leading iconoclastic biologists in the twentieth century, across fields in the Life Sciences that include human evolution, evolutionary theory, development, systematics, microbiology, biochemistry, physiology, genetics, neurobiology and brain sciences, cytology, virology, and ecology, we have tried to produce a 'taxonomy' of scientific rebellion and a 'profile' of the scientific rebel. Looking at variables such as whether or not the rebel operates within the consensus of his or her scientific community; whether or not rebels share a common methodology and course of action; whether they are 'loners' or rather attract like-minded iconoclasts and mount their challenges in groups; whether one can speak of a common disposition or temperament that characterizes scientific rebels - a Tolstoyan thesis emerges: While all conventional practitioners in the Life Sciences may be said to be conventional in the same way, all "rebels" seem to rebel in their own, particular fashion. The scientific "rebel", it emerges, is almost an "anti-category", or rather a "non-category."
The failed attempt to pin-point the profile of the scientific rebel leads to a discussion of the different discernable ways in which icons are established. These, we shall show, are quite varied, and include the ensconcement of a particular kind of method, lack of imagination, pragmatism, or the belief in unassailable theoretical constructions.
Finally, we shall want to ask: What does it actually mean that an icon has been challenged? What are the different natures of challenges to iconic thought? Here we shall attempt a heuristic categorization that we hope can help to make sense of the phenomena of scientific rebellion and challenges to iconic thought.
Session: XIII.8 Room: Peter Chalk 2.6 Time: Sunday 11:00-12:30
Eric Desjardins
University of British Columbia, Vancouver, Canada
History Dependence in Ecology
Since the 1960s, many have claimed that history matters in ecology. This historic turn has taken various directions, reflecting different understandings of ecological historicity. However, analyses of the historical nature of ecological processes remain very metaphorical. Borrowing from recent developments in political science and economics, this paper introduces a formal framework developing the notion of "history dependence," and shows its potential application
to ecology.
A process is history-independent if the outcome during any period does not depend upon previous outcomes. MacArthur’s equilibrium theory of insular biogeography is one such kind of process. Conversely, a process is history-dependent if it retains information from its past. In a deterministic framework, history dependence reduces to dependence on initial conditions. In a stochastic framework, history dependence combines causal dependence and chance, and implies a shift in the probabilities of outcomes as a function of the past.
A finer unpacking of history dependence makes the notion more significant and empirically testable. After Page (2006), I identify three forms of history dependence, each varying in degree. In its strongest form, a process is path-dependent if the order of previous outcomes matters. I exemplify this with ecological assembly rules, according to which the order of colonization of species can affect the structure of a community (Diamond 1975). At the intermediate level, a process is phat-dependent when past outcomes matter, but not their order. This would happen in cases where altering slightly the order of colonization of species does not change the community structure. In its weakest form, a process is state-dependent if a path can be partitioned into a finite number of states containing all the relevant information. Lagged density-dependent models in population ecology, where the next density is more accurately defined by the current and immediate past density (N(t+1) = N(t) f(N(t), N(t-1))), are interesting examples of state dependence.
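As a concrete illustration of the state dependence described above (not part of the abstract), the short Python sketch below iterates a lagged density-dependent model N(t+1) = N(t) f(N(t), N(t-1)); the Ricker-type form chosen for f and all parameter values are assumptions made purely for illustration.
import math

def f(n_now, n_prev, r=1.8, K=100.0, w=0.5):
    # Assumed growth function: per-capita growth falls as a weighted blend of
    # current and lagged density approaches the carrying capacity K.
    return math.exp(r * (1.0 - (w * n_now + (1.0 - w) * n_prev) / K))

def iterate(n0, n1, steps=50):
    # The pair (N(t-1), N(t)) is the finite state that predicts the next density,
    # which is what makes the process state-dependent rather than history-free.
    trajectory = [n0, n1]
    for _ in range(steps):
        n_prev, n_now = trajectory[-2], trajectory[-1]
        trajectory.append(n_now * f(n_now, n_prev))
    return trajectory

print([round(n, 2) for n in iterate(5.0, 8.0)][:10])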
Session: XI.5 Room: Peter Chalk 1.3 Time: Saturday 16:00-17:30
which they took place. More recently Robert Kohler has challenged this picture suggesting that Morgan was using selection experiments to test an evolutionary theory he proposed few years earlier. The experiments failed to meet their objective but resulted in a population expansion and whereby ever-increasing number of spontaneous mutations. These, notably the white-eyed fly, proved excellent materials for Mendelian experiments and drove Morgan into a new research domain. Kohler’s historical reconstruction surpasses the standard one but still is not entirely adequate. It is argued that observed new phenotypes were not mutations, let alone spontaneous, but rather recessive alleles that have accumulated due to an inbreeding program purported to establish a compilation of pure- lines. Furthermore, most “mutants” proved unfit for Mendelian experiments and all, save the white-eyed fly, did not furnish a confirmation of the chromosomal theory of heredity. Hence what facilitated the production of knowledge were not the material aspects of the system but the ways the latter were exploited and interpreted in light of vexing problems and novel hypothesis.
Time: Saturday 14:00-15:30
Christopher DiTeresi
University of Chicago, Chicago, IL, United States
Making Developmental Biology Second Nature: Graduate Courses as Scaffolding for Disciplinary Inheritance
Wimsatt and Greisemer (2006) have recently argued that development must be given a central place in any adequate theory of cultural evolution. In order to focus on the relation between cultural reproduction (in this case disciplinary generational reproduction) and the scaffolding of individual development, I present a case study of a graduate course in vertebrate developmental genetics at the University of Chicago. I aim to show how the course cultivates an integrated package of mutually reinforcing disciplinary skills. These include: conceptual skills (visualization and manipulation of mental models), articulation skills (specialist reading and writing), critical reasoning skills (experimental design & weak design detection), socio-professional skills (preparing and giving presentations). In addition to these relatively portable skills, students are also introduced to decidedly local features such as axes of work alignment and co- operativity within the department and university. In order to illustrate how this integrated organization functions as scaffolding, I present and discuss a visualization exercise used at the start of the course to help students conceptualize vertebrate gastrulation.
REFERENCES
Wimsatt, W. C. and J. R. Griesemer, 2006, “Re-Producing Entrenchments to Scaffold Culture: The Central Role of Development in Cultural Evolution,” in Roger Sansom and Robert Brandon eds., Integrating Evolution and Development: From theory to practice, MIT Press.
Session: X.6
Room: Peter Chalk 1.5
Stephen Downes
University of Utah, Salt Lake City, UT, United States
Session: VIII.4 Room: Peter Chalk 1.1 Time: Saturday 9:00-10:30
Life History Theory, Optimality Modeling and Evolutionary Explanation
In the context of arguing that Evolutionary Psychologists should adopt the methods and pay attention to the results of life history theory, Kaplan and Gangestad claim that “as originally conceived LHT [Life History Theory] concerned the timing of events. Increasingly, however, biologists have found that the understanding of phenomena not traditionally thought of as life history events in fact requires an explicit life history approach. Hence, LHT has increasingly subsumed cost-benefit analysis in many areas. Rather than being defined by the phenomena it explains, LHT is a general analytical approach to understanding selection” (2005, 77). They make two strong claims: life history theory subsumes optimality modeling and life history theory is a general approach for understanding selection. In this paper I assess these two claims and endorse weaker versions of both of them: life history theory is an important species of optimality modeling and that life history theory does contribute to our understanding of evolution. Life history theory is not the only general approach that should be used to understand selection but explanations derived from life history theory are an important component of evolutionary explanation. Further, I argue that
Igal Dotan
Max Planck Institute for the History of science, Berlin, Germany
Interrogation of a Fly
In 1908, prominent American biologist Thomas Hunt Morgan (1866-1945) initiated an experimental breeding program using the fruit fly Drosophila melanogaster. After two years of futile attempts to induce mutations he spotted a white-eyed fly that soon would win him worldwide fame as the founder of neo-Mendelism. The fly was followed by other mutants, all appearing within few months to radically transform the course of research. For many years these remarkable events were explained away as serendipitous occurrences not related in any conceivable way to the experimental system in
explanations derived from life history theory lend support to the “anti-reductionist consensus.” Reductionists, such as Alex Rosenberg (See e.g. 2006), argue that life history theory explanations are incomplete and completed only when the relevant reduction to molecular biology is provided. A close examination of some life history theory explanations indicates that their explanatory capital will not be enhanced by this kind of reduction.
REFERENCES
Kaplan, H. S. and S. W. Gangestad (2005), “Life History Theory and Evolutionary Psychology,” in D. Buss (ed.), The Handbook of Evolutionary Psychology. Hoboken, NJ: Wiley, 68-95.
Rosenberg, A. (2006), Darwinian Reductionism: Or, How to Stop Worrying and Love Molecular Biology. Chicago: University of Chicago Press.
Igal Dotan
Max Planck Institute for the History of Science, Berlin, Germany
Interrogation of a Fly
In 1908, prominent American biologist Thomas Hunt Morgan (1866-1945) initiated an experimental breeding program using the fruit fly Drosophila melanogaster. After two years of futile attempts to induce mutations he spotted a white-eyed fly that would soon win him worldwide fame as the founder of neo-Mendelism. The fly was followed by other mutants, all appearing within a few months to radically transform the course of research. For many years these remarkable events were explained away as serendipitous occurrences not related in any conceivable way to the experimental system in
Session: XII.5 Room: Peter Chalk 1.3 Time: Sunday 9:00-10:30
Two Requirements for the Concept of Genetic Drift
The standard philosophical interpretation of population genetics theory holds that genetic drift and natural selection are causes of evolution. Matthen and Ariew (2002) and Walsh, Lewens, and Ariew (2002) have attempted to problematize this interpretation. They claim that drift and selection are not causal. Rather, they are statistical summaries of various individual level causal events that cannot themselves be identified as drift or selection.
There have been several responses to these papers, most taking what I will call a conceptual approach. The conceptual approach presents arguments about the nature of the concepts of drift and selection that purport to show they are causal. At least one response, however, due to Reisman and Forber (2005), takes what I will call an epistemological approach. They argue that biologists commonly identify drift as an evolutionary cause through controlled experiments and that they are justified in doing so. Reisman and Forber employ the manipulability theory of causation to identify a particular manipulation as the cause of a particular evolutionary change. They assert that this kind of experimental manipulation is a drift manipulation. However, in order to successfully show that drift can be a cause, we need to demonstrate that the manipulation Reisman and Forber discuss was in fact a drift manipulation. And to do this, we need to have a concept of drift that fits the experimental case.
I argue that the concept of drift that fits this experimental case must meet two requirements: (1) drift is distinct from natural selection and (2) the strength of drift can vary. I examine Millstein’s (2002) account of drift and find that it satisfies (1) but fails to satisfy (2). On the other hand, an account of drift like Brandon’s (2005) fails to satisfy (1) but does satisfy (2). I conclude there is a tradeoff between the distinctness of drift and selection and their ability to vary in strength. Therefore, it seems unlikely that epistemological approaches like Reisman and Forber’s will be successful in showing that drift and selection are causes of evolution.
William Dritschilo
Independent Scholar, Proctor, VT, United States
Session: III.1 Room: Newman B Time: Thursday 16:30-18:00
Dan Simberloff and Methodological Succession in Ecology
When the “Era of Ecology” dawned in the middle of the last century, ecology was in transition from its naturalist roots to a more modern, theory-based science. The time-scale of ecological processes made experimentation a less established tradition than in other sciences, especially in the area of community ecology, the study of the interactions of ensembles of species. Much theory in community ecology of the day might best be described as “ecopoetry”: high in interest, low in rigor. Community ecologists, however, found themselves in the public domain of wildlife conservation applying a perhaps ecopoetical theoretical framework established in great part by an icon in the field, Robert H. MacArthur, who has been described as the “James Dean of ecology.” Into the confusion stepped Dan Simberloff. The logic of his thinking—the result of mathematical training combined with childhood exposure to experimentation—caused Simberloff to confront both his own ground-breaking doctoral research under E. O. Wilson and the ghost of MacArthur in the form of his successor, Jared Diamond. Simberloff and the colleagues he drew to him persevered through a storm of criticism to alter fundamental methodology in community ecology and evolution and help usher in the ecology of today.
Matthew Dunn
Indiana University, Bloomington, IN, United States
Session: I.6 Room: Peter Chalk 1.5 Time: Thursday 11:30-13:00
Jean-Claude Dupont1,2
1) University of Picardie, Amiens, France, 2) IHPST, Paris, France
Session: X.9 Room: Peter Chalk 2.5 Time: Saturday 14:00-15:30
The history of integration: from Spencer to Sherrington and later
Integration is now very successful within the field of the life sciences: neuroscience, physiology, and a substantial part of biology equally claim to be “integrative”. The term appears today in the titles of scientific journals: Integrative Biology, Integrative and Comparative Biology, OMICS: A Journal of Integrative Biology, etc. What is crucial in this kind of context is the contrast between the element and the complexity involved in the functioning of an organized system, which gives the word “integration” a rather wide meaning.
However, this situation results from a complex genealogy. Having initially a mathematical sense, just as the concept of function, the concept of integration became a theoretical tool in the field of the life sciences in the second half of the nineteenth century. There it met simultaneously Herbert Spencer’s evolutionism and Claude Bernard’s physiology. It acquired a more specific meaning in the neurology of John Hughlings Jackson, David Ferrier, and especially Charles Scott Sherrington. The objections to the Sherringtonian paradigm did not prevent the development of the notion in general physiology, especially at John Barcroft’s instigation.
It is often underlined that there is a convergence between the guiding plans governing the method of these physiologists and the systemic approach to the function concept in the philosophy of biology. We propose here to investigate the successive rectifications of the concept of integration, which in each case expresses a different vision of the body and is related to a successful strategy aiming at understanding its physiology.
John Dupré
University of Exeter, Exeter, United Kingdom
Session: X.10 Room: Peter Chalk 2.3 Time: Saturday 14:00-15:30
Ontology from the Microbe’s Point of View
Microbes, single-celled organisms, have been the only living forms for 80% of the history of life, and remain the dominant living form to this day. Yet our idea of a biological organism remains largely grounded in a paradigm of a multicellular, monogenomic eukaryote. Bringing microbes into this picture seriously problematises naïve assumptions about multicellularity. First, many multispecies microbial communities meet intuitive criteria for being individual organisms: they are highly integrated and the parts exhibit an elaborate and effective division of labour. But if, as the preceding point suggests, we relax the monogenomic criterion on being an individual organism, it is difficult to see why the 90% of microbial cells that constitute a human, for example, should not be considered part of the individual human organism. Similar considerations apply to other multicellular eukaryotes, or ‘macrobes’ as I suggest they might be called. Such a conception of the organism would have radical consequences for models of evolution and, in particular, for views of the units of selection.
Ruthanna Dyer
York University, Toronto, ON, Canada
Session: VIII.11 Room: Peter Chalk 2.1 Time: Saturday 9:00-10:30
Learning through Glass: Henry Ward and the Blaschka Glass Animals in North America
The Blaschka Glass Animals are finely crafted models of marine invertebrate organisms which were originally produced in the late 19th century for collectors of natural history objects. By the last decades of the century, the models were being marketed to educational institutions. Ward’s Natural History in Rochester, New York was the licensed agent for the models in North America and began advertising them for use in the classroom and in curated collections. From the historical records of purchase and examination of sources utilizing the models, connections to practices in biology education are elucidated. This study integrates the history of the models, history of invertebrate biology, and history of secondary school education in an attempt to situate these objects in the utility of the biology classroom in the extended 19th century. The secondary school biology classroom of the period was a theatre in which morality plays of the day were enacted within the context of natural history and biological concepts of physiology, heredity and evolution. Biology was seen primarily and principally as a discipline to encourage and inculcate life-skills and values of a progressive society. The Blaschka marine invertebrate models, however, represented a static concept of the organism as an object based on the artistic works of Haeckel and other 19th century natural historians. This apparent discontinuity between the structure of the models and the goals of the progressive biology curricula is explored.
Session: II.5 Room: Peter Chalk 1.3 Time: Thursday 14:30-16:00
Thomas Ebke
Universität Potsdam, Potsdam (Brandenburg), Germany
Norm and Limit: Between Helmuth Plessner and Georges Canguilhem
My contribution aims at a systematic comparison of the philosophies of biology put forward by the German philosophical anthropologist Helmuth Plessner (1892-1985) and the French historical epistemologist Georges Canguilhem (1904-1995). The relationship between these two theories is characterised by significant points of contact as well as diametrical oppositions. As far as similarities are concerned, both thinkers advance the idea of an inherent type of normativity that specifies the living organism in contrast to inanimate objects. According to Plessner as well as to Canguilhem, organisms bring about highly individual forms of normality in the interaction with their vital milieus. Having this notion of the organic in common, both theories stress the frictions and intermittences between living things and their role as objects of scientific description and analysis. In this perspective, the relation between life and the life sciences is to be conceptualized less with a view to total knowledge than with a view to a moment of non-knowledge or nescience that is inherent in the living itself.
In my presentation, I will argue that this common starting point paves the way for two highly heterogeneous philosophies of biology. Whereas Plessner aims to show that understanding, describing and interacting with the self-regulatory, normative acts of the living implies the idea of a philosophical anthropology, Canguilhem illustrates the ways in which different approaches towards the normativity of the living brought about breaches in the history of the life sciences. An important aspect of my thesis is that Plessner did not elucidate in full detail the complex reciprocities between living things and the sciences that deal with them (Canguilhem’s project), while Canguilhem did not flesh out any systematic difference between human and non-human organisms (Plessner’s project). Concluding my discussion of these two unjustly neglected authors, I will raise the following problems: Is it convincing to speak of the “normativity” of an organism? If so, what consequences do organic norms and the problem of “non-knowledge” have for the formation of the life sciences? To what extent is a contemporary philosophy of biology reliant on anthropological thought?
Sophia Efstathiou1,2
1) University of California San Diego, La Jolla, United States, 2) London School of Economics, London, United Kingdom
Found Science: Found ‘race’ in Science
Results in population genomics contradict Michael Root’s argument that race is a bad proxy for human genetic variation (Root, 2003). Recent studies suggest that US race and ethnicity categories are good proxies for human genetic variation, first by associating genetic variation with differences in ancient geographical ancestry (Rosenberg et al. 2002, 2005) and more recently by associating human genetic variation with self-identified race/ethnicity categories (Tang et al. 2005).
I examine the data-mining method used to pick out this reported structure (Pritchard et al. 2000). I argue that though Root’s argument stands contested, what structure Tang et al. (2005) choose to report is contingent. The structure these scientists report is not selected by the data model applied to their data. But this does not have to reduce the scientific worth of their findings.
The genetic structure reported to accord with race/ethnicity categories is what I call a ‘found’ scientific object. By analogy to found art objects, this pattern of genetic variation is a found scientific object: it brings the ready-made category ‘race’ into a context of
interests (epistemic and pragmatic ones) and stamps it by the authorship(s) that mark it as science.
REFERENCES:
Falush, D., M. Stephens and J. K. Pritchard (2003), “Inference of Population Structure Using Multilocus Genotype Data: Linked Loci and Correlated Allele Frequencies”, Genetics 164: 1567-1587.
Pritchard, J. K., M. Stephens and P. Donnelly (2000), “Inference of Population Structure Using Multilocus Genotype Data”, Genetics 155: 945-959.
Rosenberg et al. (2002), “Genetic Structure of Human Populations”, Science 298: 2381-2385.
Rosenberg et al. (2005), “Clines, Clusters, and the Effect of Study Design on the Inference of Human Population Structure”, PLoS Genetics 1(6): e70, 660-671 (www.plosgenetics.com).
Tang, H., N. Risch et al. (2005), “Genetic Structure, Self-Identified Race/Ethnicity, and Confounding in Case-Control Association Studies”, American Journal of Human Genetics 76: 268-275.
Session: II.5 Room: Peter Chalk 1.3 Time: Thursday 14:30-16:00
Session: II.3 Room: Newman D Time: Thursday 14:30-16:00
Yulia Egorova
Durham University, Durham, United Kingdom
DNA Evidence? Genetic anthropology and history
Recent genetic studies aiming to reconstruct the history of human migrations have claimed to be able to contribute to the writing of history. The past decade has witnessed genetic research on African American and Native American ancestries, the migration of the Maoris and the origins of Indian Jews, to name just a few examples. These tests have raised some important questions about the ethical implications of this type of research and the effect that it may have on the identity of the tested communities, issues that have inspired a number of anthropological and sociological studies. But are historians aware of the findings in genetics that are relevant to their work? Do they accept this research as historical evidence? Has genetics had any effect on how historians write about the past? This paper explores these questions by looking at the responses of relevant historians and social scientists to genetic research on the origin of the caste system and of the Jewish communities.
Kevin Elliott
University of South Carolina, Columbia, SC, United States
Varieties of Exploratory Experimentation in Nanotoxicology
One of the most exciting and challenging aspects of nano-level science and technology is the potential for nano-engineered materials to exhibit unpredictable properties. This paper examines the research processes by which scientists study and systematize these unpredictable characteristics of nanoparticles. In particular, it focuses on the developing sub-field of nanotoxicology, which seeks to investigate the toxicological properties of these materials. Some evidence suggests that nanoparticles could have a variety of unpredictable toxic effects as a result of properties such as high surface-area-to-mass ratio, increased ability to move through biological systems, and an increased potential to adhere to other substances (both biological materials such as proteins and also other toxins). A working group led by the high-profile nanotoxicologist Günter Oberdörster recently published an extensive set of suggestions and guidelines for screening nanoparticles to develop a better understanding of their toxicological characteristics.
Using the Oberdörster screening plan as a case study, the present paper argues for two main claims: (1) current research in nanotoxicology can be fruitfully described as a form of exploratory experimentation, and (2) philosophical reflection on nanotoxicology can provide new insights concerning the nature of exploratory experimentation. First, the screening activity proposed by the working group fits previous descriptions of exploratory experimentation very well. For example, much of the experimental activity associated with nanotoxicology in general and the screening plan in particular is arguably directed not toward testing high-level theoretical frameworks but rather toward discovering regularities and identifying concepts that can make sense of those regularities. Second, because there is such a range of exploratory activity associated with nanotoxicology, examining this area of research can arguably improve previous historical and philosophical descriptions of exploratory experimentation. For example, the nanotoxicology case study suggests that it would be helpful to distinguish different sorts of exploratory experimentation, including novel varieties that involve computer simulation.
Session: III.7 Room: Peter Chalk 1.6 Time: Thursday 16:30-18:00
Christopher Eliot
Hofstra University, Hempstead, NY, United States
Ecological Mechanisms
Current, prominent accounts of mechanisms in biology—particularly Glennan’s and Machamer, Darden, and Craver’s (MDC)—primarily treat mechanisms inside organisms, in accord with a traditional view of mechanisms as being, like machines, compact, physically-integrated systems. Skipper and Millstein, however, have recently extended the explanatory use of mechanisms to treat natural selection as a mechanism. This paper examines plant ecologist David Tilman’s use of mechanisms to explain plant succession. I argue that the phenomena he identifies are appropriately considered mechanisms, even though they do not count as such under the Glennan, MDC, or Skipper/Millstein definitions. This requires arguing that a phenomenon or process can be a mechanism despite not having a single, repeatedly-produced outcome. All this serves both to improve our understanding of how mechanisms can explain, and to expand our understanding of explanatory strategies for addressing complex, multi-causal phenomena such as are found in ecology (and perhaps also meteorology, epidemiology, etc.).
Session: VI.10 Room: Peter Chalk 2.3 Time: Friday 14:30-16:00
Session: I.5 Room: Peter Chalk 1.3 Time: Thursday 11:30-13:00
Rebecca Ellis
Lancaster University, Lancashire, United Kingdom
DNA Bar-coding: a mere tool or the potential to remake our relationship with life?
In 2003, a team of taxonomists and bioinformaticians published their claims that one gene segment (CO1) could be used to characterise and identify (‘barcode’) most if not all of the planet’s higher animal species.
A way had been found to standardise species identification, alongside speedily mobilising digital global access to taxonomic data. Reactions to the news from the taxonomic community were ambivalent and hopes for a taxonomic revolution were tempered by concerns about reducing species identity to the selection of 648 nucleotide base pairs. For some however, this shift in taxonomic practice has been hailed as ‘an enterprise that promises to remake our relationship with life’.
This paper draws on recent interdisciplinary research undertaken by a team of sociologists and anthropologists of science at Lancaster University, in collaboration with the Botany Department at the Natural History Museum in London. It will explore differing characterisations of the taxonomic potential of ‘bar-coding’ and the implications these might have for possible taxonomic futures. For some, the meaning read into a single gene segment is simply rapid and accurate species identification, whilst for others a ‘barcode’ can divulge evolutionary relationships. In a sense, this exploration draws on and contributes to ongoing (and historical) debates within the taxonomic community, concerning the relationship between pattern and process. Inspired by recent anthropological analyses of ‘biocapital’ (e.g. Rose 2006, Sunder Rajan 2006), one of the particular interests of the research team is to understand how ‘life’ as capital becomes enrolled as taxonomists, together with bioinformaticians and new public and policy users of taxonomic science, negotiate the different kinds of latent value (horizontal mapping versus genetic ‘memory’) read into single gene segments or ‘bar-codes’.
George Ellison, Simon Outram
St George’s - University of London, London, United Kingdom
The Business of Racial Criticism in Biomedical Research
Despite what many considered the definitive UNESCO statements on ‘race’ during the early 1950s, racial categories have continued to be used as analytical variables in biomedical research. This has sustained a growing body of academic work critical of ‘racial science’ which has engaged some of the greatest biologists of the 20th century, including Stephen Jay Gould, Richard Lewontin and Jared Diamond. It has also provided an opportunity for communicating complex scientific issues to a wider audience of scientists and lay people – not least those relating to the evidence required to establish validity and causality, and the nature of gene-environment interactions. However, the ‘business of racial criticism’ has largely failed to discredit racial categories as markers of innate genetic difference, and some have argued that paying attention to race has only served to reify its importance in the eyes of scientists and lay people. This paper aims to understand why biological notions of race remain as much an issue at the beginning of the 21st century as they did at the beginning of the 20th. It will draw on a systematic review of published criticisms and concerns about the use of racial categories which suggests that these continue to be poorly conceptualised, defined and operationalised in biomedical research, and are rarely valid markers for genetic variation. Despite these concerns a subsequent review of biomedical and genetics journal guidelines found very few that sought to improve the use of such categories. Indeed, interviews with genetics journal editors found there to be limited experience or understanding of the problems racial categories pose, and little willingness to develop appropriate guidelines beyond engaging the research community in further debate. Moreover, these interviews revealed how geneticists have been able to co-opt or circumvent the key criticisms levelled at racial categories as markers of genetic variation, by: (i) accepting these as social constructs yet adopting quasi-racial ‘ethnic’ categories to improve their apparent palatability, salience and analytical value; and (ii) distancing the scientific use of these categories from their social meanings and value. Geneticists seem to be able to achieve this sleight of hand because, first, they subscribe to an untheorised approach to science in which the utility of scientific tools is sufficient to validate these as real and, second, they operate within a scientific sub-culture which renders their work immune
from social criticism. This paper concludes that the business of racial criticism might only succeed when the criticisms and concerns address the untheorised nature and powerful subculture of biomedical research rather than the fallibility of ‘race’ per se.
Session: XI.11 Room: Peter Chalk 2.1 Time: Saturday 16:00-17:30
Fern Elsdon-Baker
Independent, Leeds, United Kingdom
Darwinism and the ever changing definitions of the ‘inheritance of acquired characteristics’
It has often been assumed that the term ‘inheritance of acquired characteristics’ has referred solely to a ‘Lamarckian’ mechanism of speciation. However, more recent work suggests that a form of ‘inheritance of acquired characteristics’ was present not only throughout Darwin’s work (most notably with Pangenesis) but also in that of Weismann.
‘Darwinism’ towards the end of the nineteenth century was a much contested terminology: its usage played out in the complex relations between theorists such as Romanes, Weismann, Wallace, Butler, and Spencer. The key to understanding the role of groups defining themselves as either ‘Darwinian’ Darwinians, ‘pure’ Darwinians or Neo–Darwinians lies in their understanding and acceptance of the differing mechanisms of the inheritance of acquired characteristics. If we accept that key theorists in these debates, for example Weismann, were not as resolutely committed to a panselectionist stance as is usually portrayed, then the role of the terms ‘Darwinism’ and ‘inheritance of acquired characteristics’ at the turn of the last century was far more complicated than previously thought.
In this paper then I shall explore the appropriation of the terms ‘Darwinism’ and ‘inheritance of acquired characteristics’ in the 1880s – 1890s to represent various very different theories of evolution, and all their differing factions. In particular, I shall focus on an ongoing debate that was played out in the letters pages of Nature in 1894 between Sir Edward Fry, Lankester, Cunningham and Poulton concerning the definition of the term ‘acquired characters’ in Weismann’s work. This highlights the ongoing problem that the term ‘inheritance of acquired characteristics’ is actually an umbrella term which has periodically been used to represent at least two distinct mechanisms of speciation.
Session: VII.5 Room: Peter Chalk 1.3 Time: Friday 16:30-18:00
John Emrich
The George Washington University, Washington, DC, United States
Session: IV.11 Room: Peter Chalk 2.1 Time: Friday 9:30-11:00
The Laws of Life: The First Patent of an Engineered Life Form
Today we regularly hear about new genetically modified food, patents on genes, as well as a host of different collisions of science, business, and the law. The patenting of life is a very contentious issue among scientists, ethicists, lawyers, and, of course, the general public. The origins of patenting genetically engineered organisms followed quickly in step with the advances of recombinant DNA in the late 1970s, when Ananda Chakrabarty, a General Electric researcher, created a Pseudomonas bacterium combining genetic material from four strains, capable of “eating” oil faster than any found in nature. Since the bacterium he created met the U.S. legal standard of “any new and useful process, machine, manufacture, or composition of matter,” Chakrabarty and GE submitted an application for a patent, which was quickly met with a lawsuit contesting the patentability of life. The patent case quickly worked its way to the Supreme Court, where it was ruled in 1980 that the “new life form” was patentable. The decision opened the floodgates to the patenting of any new organism created via genetic engineering. This paper will explore the origins of the first U.S. patent on a unique “invented” life form and the repercussions in the scientific, business, and legal communities.
Marc Ereshefsky
University of Calgary, Calgary, Canada
Behavioral Homology and Psychological Categories
In a series of publications, Griffiths and Matthen suggest that some psychological categories (such as fear) are homologies. They argue that studying such categories as homologies rather than as analogies provides a richer understanding of those categories. Griffiths and Matthen’s approach to psychological traits is a promising one; however, much of their discussion is limited to drawing parallels between morphological homologies and psychological traits. The present paper turns to recent work in ethology to draw parallels between behavioral homologies and certain psychological traits. The paper illustrates how theoretical ideas and operational criteria from ethology can be applied to psychological traits.
The benefits of applying ideas from ethology to psychological traits are several. First, behavioral homologies and certain psychological traits display similar ontogenies and hierarchical structures. Combined with data from comparative studies, such similarities provide evidence that those psychological traits are homologies. Second, theoretical ideas concerning behavioral homology suggest new ways to address standing questions about psychological traits. Third, ethology contains operational criteria for identifying behavioral homologies that can be used for identifying psychological homologies. Such criteria provide the program of studying psychological traits as homologies with a substantive empirical methodology.
Session: II.6 Room: Peter Chalk 1.5 Time: Thursday 14:30-16:00
Antoine Ermakoff
REHSEIS, UMR7596 (CNRS-Université Paris 7 Denis Diderot), Paris, France
Session: V.6 Room: Peter Chalk 1.5 Time: Friday 11:30-13:00
Jacques Tenon, Felix Vicq d’Azyr, and the Hospital Reforms of the End of the Eighteenth Century in France
Most accounts of late eighteenth-century French medicine emphasize the links between important changes in medical knowledge and practice and the socio-political unrest of that period, which eventually resulted in the outbreak of the French Revolution.
Toby Gelfand in Professionalizing modern medicine, Jacques Léonard in La Médecine entre les pouvoirs et les savoirs, or Michel Foucault in The Birth of the Clinic, all describe the last decade of the eighteenth century as a crucial period, and insist on the role of the hospital as a pivotal place for new ways of knowing and practicing medicine.
Indeed, the hospital is a place of emergence of new knowledge and practices, as shown in Ackerknecht’s Medicine at the Paris Hospital. As Michel Foucault put it, it is also a place for the “clinical gaze” and its experimentations with the hierarchical structure, the wards and series of patients. It is a place of struggle between powers as well. Considering Christian commitment to relief of “the poor and the sick”, the growing state control of population, and the increasing number of medical staff in hospitals, it is not surprising that religious, political, social and medical powers closely intertwined in hospitals.
My aim is to investigate further the role of the hospital in the shaping of medicine at that time, by looking at the work of two physicians of that period: Jacques Tenon (1724-1816) and Félix Vicq d’Azyr (1748-1794). By comparing and contrasting their work, respectively the 1788 Mémoires sur les hôpitaux de Paris and the 1790 Nouveau plan de constitution pour la médecine en France, my paper will show that the reforms championed by Tenon and Vicq d’Azyr shed a new light on the changes undergone by medicine and hospitals at that time: the new “clinical gaze” entailed by the newly adopted architectural patterns of the hospital, the role devoted to teaching medicine “at the bedside”; the new epistemological status of diseases, varying from a Hippocratic approach, a nosological one and an anatomical-pathological method; the entanglement of political, religious, and scientific powers, and how the political one succeeded in limiting the others. Finally, I will show that these authors invalidate the idea of a discontinuity between the pre- and post-revolutionary roles, functions and aims of hospitals, and rather seem to be a part of a long-term trend resulting in the reinforcement of state powers regarding hospitals and public health policy.
Arantza Etxeberria
University of the Basque Country, San Sebastian, Spain
Developmental Constraints and possible life
In reviewing Pere Alberch’s early contribution to evo-devo, the subject of developmental constraints is unavoidable. In the late 1970s and 1980s, work around this concept triggered significant concern with development within post-synthetic evolutionary biology, especially as one of the key issues in Gould & Lewontin’s critique of adaptationism.
Pere Alberch’s work pursued this effort further by providing evidence for constraints in two different ways. One of them was theoretical work on the structure of morphospaces, comparing the existing phenotypes for a number of traits with the theoretically possible ones. The other was looking for evidence of internal construction rules of developmental processes.
Both of them can be understood as positing limitations for possible life, but they can also, especially the second, be seen in positive terms as rules guiding further evolution in the form of a non-random production of variants.
Some theorists have stressed the need to further develop this second understanding of developmental constraints, and to avoid using the notion of constraints for limitations of the action of natural selection produced by other means, for example, by historical contingencies.
In any domain, constraints are generically understood as additional rules acting on the set of natural laws or
norms of a given system. In this sense, developmental constraints would constitute forms of “weak” causality, at the same time influencing the evolutionary path and being, themselves, evolvable.
Pere Alberch’s work pointed in this direction. In addition to his work on evolution, some of his writings drew on developmental constraints to understand how art may represent possible life, or to conjecture on the existence of inherent constraints on representation stemming from the complex dynamic nature of the human brain itself; these writings confirm a view of constraints as forms of “weak” causality.
Session: II.1 Room: Newman B Time: Thursday 14:30-16:00
Session: XII.1 Room: Newman B Time: Sunday 9:00-10:30
Raphael Falk
The Hebrew University of Jerusalem, Jerusalem, Israel
Wilhelm Johannsen: A rebel or a diehard?
Johannsen’s analysis of the notions of inheritance in the early twentieth century allowed him to reformulate the relationship between empirical, observed traits and inferred Mendelian Faktoren.
Like Mendel, Johannsen was interested in a numerical analysis of the problem of inheritance. However unlike Mendel who was interested in a bottom-up view of inheritance of specific, individual characters, Johannsen was primarily interested in the top-down perspective of the inheritance of the “type.” Pure lines were good empirical approximations to his concept of essential Linnaean types. Parameters of populations, being mixtures of types, provided only superficial appearance statistics. Therefore, phenotypic appearance variables should be conceived distinctly from immanent biological genotypic parameters.
Three major aspects of early genetics were reconceptualized with the introduction of the genotype-phenotype discrimination: that of the stability of Mendelian factors; that of continuous Darwinian evolution; and that of Weismann’s preformationist development.
It is, however, a key paradox that by the very elimination of the unit character from genetic parlance and the introduction of the gene, Johannsen provided a new, genetic framework for Galton’s posing of Nature versus Nurture, thus inadvertently spawning a conceptual, rather than merely heuristic, reductionist “genocentricity.”
Melinda Fagan
Indiana University, Bloomington, Indiana, United States
Stems and standards: social mechanisms for managing complexity in immunology
I examine the social mechanisms by which complexity is managed in scientific inquiry. Many lines of scientific inquiry aim at knowledge of complex things. I focus on immunology, which aims at knowledge of the immune system and treatment of its disorders. This aim can be fruitfully conceived as management (regulation, control) of the immune system’s notorious complexity. This conception of the aim of immunology foregrounds the activity of contemporary biomedical practice, in both its epistemic and therapeutic aspects.
On this view, the significance of blood stem cell research for immunology is clear. Hematopoietic (blood) stem cells (HSC), localized in bone marrow, divide to yield progeny capable of differentiating into all the diverse cells of the immune system: B and T lymphocytes, natural killer cells, macrophages, granulocytes, erythrocytes, etc. HSC are, literally, the starting point for explaining and understanding cell-mediated aspects of immunity. Isolation and characterization of HSC is thus a means of managing the complexity of the immune system and its pathological variations.
Blood cell development (hematopoiesis) being a complex process, isolating and characterizing HSC is itself an attempt to manage complexity. I contrast two social arrangements aimed at such management: (1) ‘division of cognitive labor’ among various laboratory groups, and (2) ‘construction by collaboration’ of a mechanism for isolating and characterizing HSC. The search for HSC by hematologists and radiobiologists (1963 to 1988) proceeded by means of the first, the Weissman group’s search for HSC (1968 to 1988) by means of the second.
The two searches ‘collided’ in 1988 (Spangrude et al 1988, Lord & Dexter 1988). This collision, and the subsequent reorganization of the search for HSC, reveal social epistemic norms implicit in biomedical practice. I explicate two such norms, based on this episode, and examine their general epistemological significance.
Session: XIII.11 Room: Peter Chalk 2.1 Time: Sunday 11:00-12:30
Jamie Feldman
Florida State University, Tallahassee, Florida, United States
The Primacy of the Heart in Aristotle’s Biology and Psychology
James Lennox, in his book on Aristotle’s philosophy of biology, argued that there are two reasons a philosopher and a historian of biology would want to study Aristotle. They are that Aristotle essentially created both the science and philosophy of biology, and that his influence on the history of biology persisted through the 19th century. He says that Aristotle created the very idea of a general scientific investigation of living things, and that these animal studies profoundly influenced the origins of modern biology in medieval and Renaissance Europe. Furthermore, Aristotle’s philosophical understanding of nature in terms of substance, matter, form, essence, definition, division, explanation, and teleology, which shaped and formed his way of doing biology, is an interesting study for philosophers for its affinities with, as well as its differences from, our modern philosophy of biology.
Since Aristotle constructed his philosophy of biology from a position of hylomorphism, that is, the conjunction of matter and form, Aristotle’s biology was, in effect, a study of the soul. This is because, for Aristotle, the soul is the form of the body, and the body is the matter of the soul. To make this clearer, it should be understood that for Aristotle there were four classes of causes: 1. the material cause, which is the matter of the thing, 2. the law according to which it has grown or developed, the form or formal cause, 3. the agent with whose initial impulse the development began, the efficient cause, and 4. the completed result of the whole process, the final cause. Therefore, the body is the material, the instrument of the soul, which is the principle of organic life. The organ to which Aristotle assigned the primacy of being the seat of the nutritive soul, as well as that of locomotion and sensation, was the heart; as such it is the efficient cause, and the final cause is the animal’s life. In opposition to earlier writers, such as Plato, Homer, and some of the Hippocratic authors, Aristotle maintains by argumentation and observation that the heart, and not the brain, is the seat of the soul.
Session: II.6 Room: Peter Chalk 1.5 Time: Thursday 14:30-16:00
Sarah Fisk
Florida State University, Tallahassee, Florida, United States
Session: VII.10 Room: Peter Chalk 2.3 Time: Friday 16:30-18:00
Nineteenth-Century Studies of Hysteria and the Work of Jean-Martin Charcot
The word hysteria derives from the Greek word, hystera, and refers to the condition in which it was believed that a woman’s uterus floats throughout the body seeking children, thereby causing the state known as hysteria. This condition was thought to have caused severe emotional disturbances in women. In the 19th century, a French neurologist by the name of Jean-Martin Charcot made a medical study of the condition known as hysteria in a large group of hospital patients. He was particularly interested in the underlying neuro-physiological causes of the condition and how these causes were, in turn, brought about. His main goal was to create a documented pathology and treatment method for the condition. He believed that all patients suffering bouts of hysteria exhibited certain universal symptoms that, once identified, could be universalized to apply to all hysteria patients. This identification, of course, would greatly aid in diagnosis and, consequently, treatment. Charcot’s work had a profound influence on Freud’s later work on hysteria.
The focus of this paper will be the 19th century study of hysteria, particularly Charcot’s work and the work of those who influenced him. I will also examine the sociological factors affecting both the study of hysteria as a medical condition and the various views of the condition itself. Finally, I will explore the various 19th century treatments of hysteria and conclude with a discussion of the alleged wane and disappearance of hysteria as a diagnosis in the 20th century.
Susie Fisher
The Open University of Israel, Raanana, Israel
Not Beyond Reasonable Doubt – A Re-examination of Howard Temin’s Provirus Hypothesis
During the early 1960s, H. M. Temin (1934-1994) began advocating a hypothesis intended to support the idea that animal viruses play an etiological role in cancer. Temin’s ‘provirus hypothesis’ offered an explanation based on a molecular mechanism for heritable infection and transformation of chick embryo cells by Rous sarcoma virus. According to this hypothesis, following the introduction of the virus’s RNA into the host cell, the RNA is transcribed into DNA and is integrated into the cell’s genome. These viral “DNA genes” were to account for the permanent changes of morphological characteristics and physiological behaviour of the cells and also for continued virus production. In addition, Temin provided experimental evidence in support of his hypothesis using new and innovative techniques.
Most historical accounts, written by fellow-scientists and historians of science, assume wide rejection of the hypothesis by the scientific community at large and point to its apparent contradiction with the “central dogma of molecular biology”. Furthermore, these accounts allude, only in passing, to the weaknesses of Temin’s experimental evidence and to his standing in the academic world.
In this paper I would like to offer an account of the provirus hypothesis’s reception from the perspective of researchers working in Temin’s proximate work domain - animal virology and cancer research (tumour viruses) - during the early 1960s. Concomitantly, its reception will be considered in light of the new biological knowledge just being created and assimilated, for example the discovery of DNA polymerase, the elucidation of the concept of mRNA and ongoing investigations of RNA virus replication, as well as the employment of new techniques such as DNA/DNA and DNA/RNA hybridization. At this point, the science of molecular genetics was still in the making, a flux of ideas, discoveries and techniques yet to be “black-boxed”, thus permitting an epistemological flexibility regarding a molecular mechanism which appeared to contradict the “central dogma”.
Session: VII.7 Room: Peter Chalk 1.6 Time: Friday 16:30-18:00
Simon Fitzpatrick
University of Sheffield, Sheffield, United Kingdom
Session: X.6 Room: Peter Chalk 1.5 Time: Saturday 14:00-15:30
Simplicity and Methodology in Animal Psychology: A Case Study
Arguments appealing to simplicity as a consideration for deciding between rival theories are common in many areas of science, but they have been especially influential in animal psychology. In the absence of sufficient empirical evidence for their chosen theory about the causes of animal behaviour, animal psychologists will often argue that it should be favoured over alternative theories because it is simpler. In this paper I will discuss the important philosophical problems that such arguments raise, both from the perspective of general concerns about simplicity in the philosophy of science and concerns that are more specific to the study of animal minds. I will do this with the help of a concrete case study from modern animal psychology, where appeals to simplicity have been both very influential and also controversial: the debate over whether non-human primates reason about the mental states of others — whether they have a “Theory of Mind” (ToM).
An interesting feature of this debate is that both sides have sought to enlist simplicity as a consideration that supposedly tells in favour of their respective views (e.g. Whiten, 1996; Povinelli and Vonk 2003; Tomasello and Call, 2006). I will take a closer look at the logic of the various simplicity arguments that have been proposed in this debate and highlight the problems they face. Aside from problems of justification (what justification is there for preferring simpler psychological theories in animal psychology?) I will focus in particular on the problems raised by the multiplicity of conflicting ways in which psychological theories can be assessed for their simplicity.
The main objective of the paper will then be to defend a general account of the role of simplicity in science, which addresses the problems raised, and draw out the implications of this account for the role of simplicity in animal psychology. The account I will defend is the “Deflationary Account of Simplicity” (Fitzpatrick, 2006), which builds on the work of Elliott Sober (1988) and others. According to this account, simplicity per se is not appropriate grounds for theory choice. When simplicity matters in theory choice—which is not always the case—it matters because it is a stand-in for other properties of theories whose virtue is determined by the local background theoretical context. I will argue that this account allows us to make good sense of the plausibility of some appeals to simplicity in the debate about primate ToM and the implausibility of others. I will conclude with some brief remarks about where I think this leaves the question of whether chimpanzees have a ToM.
Patrick Forber
Tufts University, Medford, MA, United States
What Good Are Optimality Models Anyway?
Maynard Smith championed the strategy of using optimality and game theoretic models to investigate evolution. Yet the use of optimality models in evolutionary biology remains contentious, in large part due to the impact of Gould and Lewontin’s methodological critique in their “Spandrels” paper. Optimality models overlook genetic architecture and other non-adaptive factors that can confound the power of selection to “optimize” particular traits. Given these lasting and foundational problems, do optimality models have a place in the explanatory tool box of evolutionary biology? I will argue that they do because optimality modeling, as a methodological strategy, provides crucial ecological information for evolutionary explanations. Ecological information is important to an evolutionary explanation because it helps fill an important causal gap: why specific phenotypic variation has consequences for evolutionary fitness. Filling the causal gap is crucial to overcome a pervasive problem of evidence, the problem of contrast failure. I will use an example to illustrate the problem of contrast failure and show how ecological information helps overcome this problem. Optimality models prove their worth by making contact with this ecological information. And using the methodological strategy of optimality modeling does not commit us to the view that evolution is necessarily an optimization process.
REFERENCES
Sober, E. (2005), “Is Drift a Serious Alternative to Natural Selection as an Explanation of Complex Adaptive Traits?”, in A. O’Hear (ed.), Philosophy, Biology and Life. Cambridge: Cambridge University Press.
Session: I.6 Room: Peter Chalk 1.5 Time: Thursday 11:30-13:00
Denis Forest1,2
1) IHPST, Paris, France, 2) University Lyon III, Lyon, France
Causal role theory of functions and theoretical changes in neuroscience
Where does change come from? How can we analyze it? I shall argue that both the causal role theory of functions and the mechanistic philosophy of scientific explanation can significantly enhance our understanding of the revision and extension of functional ascriptions in neuroscience.
First, the distinction between the proximal description of a component and the distal description of its use within a given system (Craver, 2001) helps us to understand how the (well-grounded) instantiation of a functional role is still compatible with further experimentation leading to quite different views on the corresponding functional architecture. I’ll develop this point using the example of the history of research on proprioception and muscle spindle physiology, and the shift from an explanation of their sensory power to an explanation of their involvement in predictive control and simulation.
Second, the very nature of mechanistic explanation makes the identification of component operations of a system (Bechtel, 2002) highly sensitive to the revision of our knowledge concerning the output of the system itself. My example will be contemporary research on multisensory perception and the search for new integrative mechanisms like cross-modal interactions and vertical feedback from heteromodal to unimodal cortex (Macaluso and Driver, 2005), which lead to a reappraisal of the modular paradigm in cognitive neuroscience.
Finally, I’ll try to reconcile the causal role notion of function and its so-called non-normative nature (Davies, 2001), with the pervasive influence of pathological cases on our understanding of the functioning of the brain.
Session: XII.10 Room: Peter Chalk 2.3 Time: Sunday 9:00-10:30
Session: III.11 Room: Peter Chalk 2.1 Time: Thursday 16:30-18:00
Daan Franken
University of Groningen, Groningen, Netherlands
What do nervous systems do?
Within the cognitive sciences, the answer to this question seems obvious enough: Nervous systems are information-processing devices that pick up environmental information and use this information to control behaviour. In the classical information-processing paradigm, the nervous system is the quintessential basis for a cognitive system. The ‘obvious answer’ is no longer quite as obvious as once it was, however. Microbiological research shows that information-processing descriptions apply neatly to the behaviour of unicellular organisms. Comparative genomics also raises questions regarding the phylogenetic origins of nervous systems. Asserting the fitness advantage of nervous systems on the sole basis of information-processing capacities, therefore, fails to satisfactorily explain why nervous systems arose in the first place. Important pieces of the puzzle are still missing. The paper develops the tentative claim that nervous systems evolved as a solution to motility and other existential problems faced by multicellular organisms, which act as a single unit in a four-dimensional space, as they increase in size. As the scale of organisms increases, the challenges associated with physiological change do too—exponentially. The difference in degree of difficulty between building an arch out of toy blocks and building Stonehenge is so obvious it’s trivial, yet in the study of animal behaviour such differences are often overlooked. In order to solve the same functionally described problems—including the generation of adaptive behaviour—organisms at different scales require very different mechanical solutions. The paper argues that the primary task of a nervous system thus is the generation and coordination of spatiotemporal patterns across a bodily or muscular surface, patterns that are reflected in behaviour. The hypothesized role of nervous systems is clearest in animals with hydrostatic skeletons. Case studies of jet propulsion, peristaltic and undulatory locomotion illustrate that at the size-scale of nematodes, annelids and molluscs, nervous systems are required to generate and coordinate activity across extended bodies. At the same time, these case studies highlight the dynamic and distributed nature of nervous control against the image of decoupleable and centralized control that comes more naturally in the study of animals with hard skeletons and cephalized nervous systems. The paper argues that the sequential reflex organization usually taken as primary in the ‘obvious’ information-processing answer to the question of what nervous systems do is more likely a secondary development.
Robert Gadda
Reed College, Portland, OR, United States
Developmental Systems and Etiological Theories of Teleology
Etiological (historically oriented) theories of teleology argue that functions of parts of organisms critically depend on natural selection. The function of a part is whatever it has been selected to do over many generations. For example, hearts in animals were selected for their ability to pump blood, and therefore
this is their function. This theory is usually tested by applying it to obviously functional examples such as hearts and eyes. Attempts to apply the theory to other biological entities, like individual cells in an organism, reveal some problems. Cancerous cells arise by somatic evolution among competing cells in an organism. This challenges etiological theories because the functions of these cells rapidly change from serving a body function to simply proliferating selfishly. In The Evolution of Individuality Leo Buss has dramatically shown that these evolutionary dynamics are part of the developmental system itself. Evolution has not produced organisms that are just programmed to make themselves (as genetic determinism implies), but has evolved multiple-level selective scenarios that reliably result in the “selected” outcome of functionally integrated, hierarchically organized organisms. This developmental systems perspective provides a new way to think about teleology.
To understand cancer and other complicated biological phenomena, I define the “default function” of a biological entity. The default function is simply the function of proliferating and spreading. Buss’ work shows that the success of multicellular organisms is due to their ability to harness the default function of groups of individual cells for the good of the whole body. In the case of cancer the default function of a lower level of selection subverts that of a higher. This reveals that applying the etiological analysis to individual cells of an organism does not make sense; one merely finds the uninteresting default function. A biological part must consist of a group of cells that emergently perform some macroscopic function in the context of an organism.
Therefore, true functions only happen between levels of selection – the clearest case being that between the cellular and organism levels of selection. This also seems to generalize to parts of cells (organelles), as well as perhaps social structures that undergo cultural evolution. This paper will show that an understanding of developmental systems does not disprove the etiological theory, but shows that functions are crucially context dependent and that a part of a system must be defined in terms of levels of selection. It is time that theoretical biology has an impact on this traditionally philosophical problem.
Session: III.5 Room: Peter Chalk 1.3 Time: Thursday 16:30-18:00
Lisa Gannett
Saint Mary’s University, Halifax, Nova Scotia, Canada
From Flies to Humans: The Genetic Basis of Group Identity
Individuals are being identified as belonging to particular groups (racial, ethnic, national, geographical, etc.) on the basis of their genetic makeup in a number of different research – and now even commercial – contexts. Multi-locus genotyping is understood to produce clusters of individuals who share “genetic ancestry” or “ancestral geographic origin” (e.g. Rosenberg et al. 2002). $99.95 plus shipping and handling buys a Participation Kit from the Genographic Project for the determination of “deep ancestry” based on mtDNA haplotype. DNAPrint genomics uses autosomal DNA and “biogeographical ancestry” to serve a number of client groups: by “uncovering” the “personal anthropology” of the genealogically curious, by “proving” American Indian heritage, and by “deducing” the “race proportions” of crime suspects based on biological materials left at the scene. This paper examines various theoretical presuppositions underlying the population genetic foundations of such claims. Some of these presuppositions are implicated in assigning flies no less than humans to groups. Others are implicated in assigning humans though not flies to groups given the social construction of race and the centrality of social group affiliation throughout our history as a species. Historical background for this discussion is provided by research conducted in the mid-20th century on variability in blood group frequencies among human groups and chromosomal type frequencies among natural populations of D. pseudoobscura.
Session: XII.8 Room: Peter Chalk 2.6 Time: Sunday 9:00-10:30
Yin Gao1, William Herfel2
1) The University of Newcastle, Newcastle, Australia, 2) Western Sydney University, Sydney, Australia
Post-classical Ecology: On the Emerging Dynamic Perspective from Self-Organizing Complex Adaptive Systems
Ecosystems ecology, especially under the influence of the energetic ecosystems models of the Odum brothers, has been guided by mechanistic and equilibrium assumptions about ecosystem dynamics. The assumptions are seldom articulated, except when they are being critiqued. Nevertheless, the assumptions are implicit in the models proposed in both theoretical and empirical work, as well as in the definition of research problems. We term such a system of assumptions a “dynamic perspective,” a concept similar to Stephen Toulmin’s “ideals of natural order.” Two aspects of the dynamic perspective of classical ecosystems ecology are as follows: the expectation by default of equilibrium, and the corresponding requirement of explanation for departures from equilibrium; and the expectation that patterns of energy flow through an ecosystem will be mechanistic, and the corresponding requirement of explanation for novel behaviour. Recent work from the self-organising and complex adaptive systems perspective offers a stark contrast to the mechanistic assumptions of classical ecosystems ecology. We will articulate elements of this new dynamic perspective that is emerging from the application of self-organizing and complex adaptive systems work in ecology. A significant part of this emerging dynamic perspective is a focus on the details of the dynamics of the individual organisms inhabiting the ecosystem. Here we will contrast the work of Claudia Pahl-Wostl with that of Robert Ulanowicz. The paper will conclude by looking at the pragmatic value of self-organizing complex adaptive systems models, using work by Stephen Lansing and James Kremer on the ecology of Balinese subak agriculture as a case study.
Session: XII.5 Room: Peter Chalk 1.3 Time: Sunday 9:00-10:30
UNAM, Mexico City, Mexico
Luciana Garbayo
Boston University, Boston, MA, United States
Modularity thinking as a way of managing complexity in developmental biology
In 1962, Herbert Simon described the increasing popularity of the study of systems in science and engineering as a response “to a pressing need for synthesizing and analyzing complexity,” rather than to “any large development of a body of knowledge and technique in dealing with complexity” (p.229).
Despite the explosion of scientific research programs that build on myriad notions of complexity, the body of knowledge that examines how complexity is dealt with has been built upon a monolithic framework that I characterize as the problem-solving approach. I identify the origins of this approach in Simon’s work, and track down its influence in the current discussion of modularity in developmental biology. More specifically, I argue that the organized effort to seek modularity in developing systems can be understood as a managerial strategy (rather than as a problem-solving strategy) that has implications at the disciplinary and organizational levels. The Pax/Six/Eya/Dach gene network will serve as an example of how complexity is thus managed.
Time in Biology: an analytic critique and a possible world semantics approach to the temporal structure of living being
In which ways does time play a role in biology? How should eventhood be embodied in theories and experiments describing mechanisms? Does chronobiology offer special insight into the ontological problem of time in biology? To begin to address these sorts of questions, this paper discusses traditional arguments for modeling continuous and discrete temporal structures in biology, and inconsistencies in our tensed and tenseless uses of time. I address some of these uses critically and suggest that the way we ascribe rigidity to natural kinds in our language renders time an ‘externalized’ dimension. This helps explain why we remain blinded by our traditional views on permanence and change. To address this problem, I propose that we allow room for flexibility concerning the relative identity of kinds over time, especially in mechanistic biology.
To this end, I interpret Griffith’s updated version of Boyd’s homeostatic property clusters as modal properties for characterizing causal relata and biological processes in terms of necessity and possibility. This assists us (1) to suitably identify symmetric, reflexive and transitive relational features of organisms and mechanisms in many possibly different temporal lines, frames of reference, and population distributions; and (2) to consistently use probability language for describing kinds per se in time. I thus suggest we explore the ways in which a possible world semantics approach – the model theory of counterfactual entities – can make sense of the logic of time and its meaning for biological identity. As a case study, I address the universal molecular clock hypothesis as proposed by chronobiology, and its bearings on the identity of organisms, populations and generations in the light of modal relationships. Accordingly, I contrast research on the synchronization of natural oscillators and the development and evolution of modulatory mechanisms with two alternative approaches – a nomological counterfactual account and a local context-dependent counterfactual account of open systems – to devise the many possibilities and eventual compossibilities for drawing world-lines and mapping biological identity in time.
Session: XII.1 Room: Newman B Time: Sunday 9:00-10:30
Vivette García
Session: VI.6 Room: Peter Chalk 1.5 Time: Friday 14:30-16:00
Session: IX.6 Room: Peter Chalk 1.5 Time: Saturday 11:00-12:30
Miguel Garcia-Sancho
Bernd Gausemeier
Centre for the History of Science, Imperial College, London, United Kingdom
Max Planck Institute for the History of Science, Berlin, Germany
Creating a genetic language: DNA sequencing and the first modern biological databases (1965-1985)
My paper will explore the effects of the first DNA sequence databases on the ontological status of this molecule and on the practice of sequencing. I will focus, concretely, on the effort to create a central European sequence repository, developed at the European Molecular Biology Laboratory (EMBL) in Heidelberg during the 1980s. By analysing a series of interviews with the database creators, their published papers and personal archives, I will discuss how the European collection required a new sort of professional – the curator – external to biological research, who adapted the database technology to the specific features of DNA.
The emergence of the curator distinguished the DNA sequence database from other modern biological repositories, such as the crystallographic and protein sequence banks. These collections, initiated during the second half of the 60s, were maintained by traditional biologists who combined research with database work. In comparing such databases with older natural history repositories and the new DNA sequence banks, I will argue that the latter completed a tendency towards abstraction in life sciences collections: the stored representations were increasingly unrelated – both as icons and as symbols – to the original specimens.
This growing abstraction was mainly due to a series of software programs that Graham Cameron and Greg Hamm – creators of the EMBL database – wrote during the early 80s to validate the stored sequence data. The programs were based on the then-emergent spell-checking tools of text processors and took into account the particular features of DNA as a language – e.g. the necessity of a specific sequence before and after a gene. Hamm and Cameron also adapted database technology – used in public administration and various private companies since the 60s – to DNA sequences and created new entry structures able to extract features from the stored information.
Leviathan and the Ultracentrifuge:
Politics, Technology and the Life Sciences in National Socialist Germany
Historians are increasingly less inclined to regard National Socialism as a regime basically hostile to science. The massive mobilization of research for agriculture, for the chemical industry and for the military is now very well documented. Genetics, too, owing to the ideological primacy of eugenics and biologism, was a science that thrived particularly well under Nazism. But what about scientific fields that were not so apparently linked to industrial, military and political applications? Was there room for innovative “basic research” in this totalitarian regime?
In my talk, I will deal with an aspect of research at one of the most distinguished German institutes in the field of the life sciences. In 1938, the Kaiser Wilhelm Institute for Biochemistry acquired the first German ultracentrifuge. What kind of research was it used for, and how was it financed? By answering these questions, I will try to throw light both on the structures of science policy in the “Third Reich” and on the state of German biological and biochemical research in the 1930s. Drawing somewhat loosely on the classic Shapin/Schaffer book, I want to use a research technology to develop a symmetric picture of science and politics. In other words, it is my intention to point out the dynamics of scientific practices in a regime that is still regarded as a totalitarian Leviathan that strictly directed science according to its ideological guidelines.
Ultracentrifuge technology came to the KWI for Biochemistry in the context of its virus research project. Like the American and British pioneers in this field, the German researchers regarded the purification of tobacco mosaic virus as the starting point for a new way to understand the chemical structure of the gene. Further, the project was seen as a vanguard venture creating a completely new form of interdisciplinary “basic research”, involving botanists, zoologists and biochemists. In fact, even during the war, the project never drifted into applied virology. Yet its basic technology, the ultracentrifuge, generated strong links with powerful institutions. The apparatus was financed by I.G. Farben industries and largely developed in cooperation with researchers of the company. One ultracentrifuge of the project was constantly spinning in an I.G.F. laboratory, where it was also used by industrial chemists. For the KWI for Biochemistry, the new technology, first calibrated by using viruses,
made it possible to take up new problems of protein chemistry. During the war, the biochemists transferred their expert knowledge to military-oriented research projects.
The ultracentrifuge thus created a sociotechnical network that not only comprised different scientific fields, but also connected various components of the National Socialist war economy. While seemingly only a technical aspect of a research project dealing with the “basic” biological question of gene structure, it does in fact offer a starting point for understanding the modernization and mobilization of science in Nazi Germany.
Time: Saturday 14:00-15:30
How far is history of science relevant for philosophy of science? The case of the gene
In several papers*, Richard Burian has argued that there are two main views of conceptual change in science. One is the “discontinuity view” (Feyerabend, Kuhn), which assumes radical incommensurability of the successive versions of a given theoretical term (according to the successive theories within which this concept is defined). The other is the “continuity view”, which Burian supports, even though he does not accept the standard philosophical account of this view (based on theory reduction). Drawing on the actual history of the gene concept, Burian maintains that “it is possible for scientists to exercise strong controls to ensure that they are referring to the same entity or entities despite large differences in viewpoint, terminology, concepts, and theoretical commitments”.
Tremont Research Institute, San Francisco, CA, United States
Session: II.10 Room: Peter Chalk 2.3 Time: Thursday 14:30-16:00
Elihu M. Gerson
Session: X.1 Room: Newman B
Jean Gayon
Varieties of intersection: Specialties and collaboration networks.
The external genital anatomy of female Spotted Hyenas (Crocuta crocuta) mimics that of males. Moreover, females are bigger and more aggressive than males. Accounting for these facts has given rise to a collaboration network among researchers from over a dozen specialties. This network, in turn, participates in two junctures: comparative developmental reproductive endocrinology and the evolutionary ecology of behavior and social organization. At the same time, the network does not participate in the world of evo-devo research, even though the problems it addresses are at the heart of evo-devo concerns. This pattern of relationships among projects and specialties raises a number of questions about the organization of research. I suggest that we distinguish between disciplines and specialties on the one hand, and networks of collaboration and engagement on the other.
Université Paris I (Sorbonne), Paris, France
After summarizing Burian’s argument, I will try to provide a general evaluation of the paradox resulting from the permanence of the term “gene” despite the non-coincidence of the descriptive contents of its various definitions.
First, I recall the structure of the classical argument showing why the concept of the gene in “classical genetics” cannot be unambiguously correlated with the theoretical concepts of molecular genetics. Secondly, I point out the inability of today’s biology to answer questions as simple as: Where are the genes? When do they exist? What are they? How many genes are there in a given organism? Thirdly, I provide an interpretation of why biologists continue to use the word “gene”. Three complementary explanations are proposed: scientific communication, economic stakes, and the struggle for scientific authority among biological disciplines.
I conclude that I agree with Burian’s subtle “continuity” view of conceptual change in the case of the gene, but with different arguments.
*Esp. “On Conceptual Change in Biology: The Case of the Gene”, in Depew and Weber (eds.), Evolution at a Crossroads: The New Biology and the New Philosophy of Science, Cambridge, MA: MIT Press, 1985, pp. 21-42.
Peter Gildenhuys
Session: III.9 Room: Peter Chalk 2.5 Time: Thursday 16:30-18:00
University of Pittsburgh, Pittsburgh, United States
Inheritance in Griffiths and Gray’s Developmental Systems Theory
What follows is a criticism of the conception of inheritance found in Griffiths and Gray’s writings on Developmental Systems Theory (DST). Inheritance is used in one of two delineations that Griffiths and Gray offer of the developmental system and it is also used in their statement of the requirements for natural selection. Griffiths and Gray’s conception of heredity is unsuited to both tasks, though for different reasons. Inheritance cannot be used to delineate developmental systems in a way that avoids Kim Sterelny’s “Elvis Presley” objection, the objection that developmental systems are unmanageably inclusive and holistic. That
is not a serious problem: Griffiths and Gray have an alternative way of delineating developmental systems, specifically by way of the concept of adaptive historical explanation, with which I find no fault. Inheritance is an inadequate concept to use in setting the requirements for selection because it leaves those requirements subject to counter-example provided expanded inheritance is endorsed, as it is by advocates of DST. I suggest how to repair this last difficulty while maintaining a commitment to expanded inheritance. Dropping inheritance as a theoretical term in Griffiths and Gray’s version of DST has little effect on the rest of the salient features of that approach to the study of biology.
Session: VII.4 Room: Peter Chalk 1.1 Time: Friday 16:30-18:00
Tel Aviv University, Tel Aviv, Israel
When Is ‘Race’ A Race? – The Use of the Race Category in Genetics, Epidemiology and Medicine in Recent Decades
My paper deals with what has been called the contemporary resurgence of the category of race in western genetics, epidemiology and medicine. I ask how “race” is used in genetic discourse and in medical and epidemiological discourse. Is “race” employed as an analytic category, a classificatory one, or at times even as a substantive one? To provide preliminary answers to these questions I have approached the issue from a historical perspective, and have chosen as source materials a number of respected and influential journals in these fields, primarily from Great Britain and the USA.
Based upon my findings I shall argue that in the second half of the 20th century the history of the race category in genetic, medical and epidemiological research has been one of seeming disappearance and renewed resurgence. My work traces the fluctuating yet continuous use of ‘race’ during the period 1946-2003, and delineates three crucial phases. It compares this use with the corpus of works by Israeli authors in these journals and their use of ‘race’ and of the race category. It concludes with some reflections upon the findings.
Session: IX.2 Room: Newman C Time: Saturday 11:00-12:30
Shane Glackin
Leeds University, Leeds, United Kingdom
Session: I.5 Room: Peter Chalk 1.3 Time: Thursday 9:30-11:00
Facts and values against the reduction of ethics to biology
Some ethicists have argued that science – biological science in particular – and its findings have important ramifications for ethical theory, and even vice versa: the “acceptable” findings of science are restricted by what is ethically permissible. Much impetus has been given to this doctrine by reductionist currents in biological theory; according to figures such as Michael Ruse, Peter Railton, and Allan Gibbard, ethical concepts are held to be explicable as a species of biological fact.
This paper argues that while biological considerations may well explain why “moral” behaviour is prevalent in the human species, they omit a crucial part of the picture: what is moral about such behaviour. Using arguments parallel to many of those standardly used to oppose mind-brain reduction, it argues that the facts of moral experience necessarily escape reduction to genetic imperatives, or other purely naturalistic modes of explanation. Reduction can provide us with moral ethology, but not with morality proper.
Since a strict separation between ethical facts and scientific values is thereby posited, the paper creates an interesting tension within the session: the strength of a scientific position is held to be independent of any moral or political implications that may be associated with it. This creates an interesting problem for the anti-reductionists whose political and personal motivations are examined in other papers; anti-reductionism itself may cast doubt upon their reasons for holding it.
Snait B. Gissis
Stuart Glennan
Butler University, Indianapolis, IN, United States
Causal Productivity, Causal Relevance and the Nature of Selection
Recent papers by a number of philosophers — among them Walsh, Lewens and Ariew [2004], Millstein [2006] and Shapiro and Sober [forthcoming] — have been concerned with the question of whether natural selection is a causal process, and if it is, whether the causes of selection are properties of individuals or properties of populations. I shall argue that much of the confusion in this debate arises because of a failure to distinguish between causal productivity and causal relevance. Causal productivity is a property of certain processes, while causal relevance is a property of certain properties. I shall argue that the productive character of natural selection derives from the aggregation of individual processes in which organisms live, reproduce and die. At the same time, a causal explanation of the distribution of traits will necessarily appeal both to causally relevant properties of individuals and to causally relevant properties that exist only at the level of the population.
Session: II.2 Room: Newman C Time: Thursday 14:30-16:00
Peter Godfrey-Smith
Harvard Univ., Cambridge MA, United States
Darwinian Populations and Transitions in Individuality
Many major transitions in evolution involve the appearance of new higher-level ‘individuals,’ emerging from cooperating collectives of lower-level entities. Examples include the evolution of the eukaryotic cell, and the evolution of multicellularity. It is often unclear when, in this process, a new higher-level individual should be recognized as real. Some discussions of the case of multicellularity emphasize reproductive specialization, for example; others regard this criterion as too restrictive. I approach these topics via a general treatment of ‘Darwinian populations,’ combining traditional summaries of the evolutionary process with input from recent work in the philosophy of science. Michod’s models of the evolution of multicellularity are used as a case study.
Time: Friday 16:30-18:00
Ontological and epistemological implications of symbiosis and symbiogenesis
The endosymbiotic theory of evolution has major ontological and epistemological consequences.
Ontologically, symbiosis goes right to the heart of different species concepts and organismal definitions. That is, since all organisms are understood as primarily chimeras, it raises the question of how to properly define a biological individual. At the species level, the endosymbiotic species concept, developed by Margulis and Sagan (2002), also has ontological consequences for how to define a species. Evidently this ontological discussion has consequences for what can be defined as a unit or level of symbiotic evolution. Hence, the classic units and levels of selection debate can be expanded to include symbiosis, and this in turn has major epistemological consequences.
Epistemologically, the following questions are in need of careful philosophical investigation.
(1) The study of symbiosis and symbiogenesis demonstrates that an organism is made up of phylogenetically acquired gene sets (e.g. mitochondrial DNA that evolved out of symbiogenetically acquired paracocci) and/or ontogenetically acquired gene sets (e.g. parasites or viruses). Therefore a pertinent question is whether or not a distinction can be made between primary and secondary partnerships. Or can these relationships be structured hierarchically?
(2) Another question relates to the number of symbionts and the pace of their evolution. Are different (endo)symbiotic partners independent units or levels of evolution, or is the organism as a whole the sole unit of evolution? If different units and levels are distinguishable within symbiotic evolution, do these different units evolve at different rates?
(3) And need all units that make up a symbiotic organism be subjected to a form of symbiotic evolution, or is there still room left for other evolutionary mechanisms? That is, does a symbiotic explanation of evolutionary events automatically exclude other evolutionary mechanisms?
(4) A last question that will be addressed is whether symbiosis and symbiogenesis are only applicable to events that take place in the microcosm.
Previous work of mine has shown that symbiosis and symbiogenesis can also be extended to extra-biological fields such as anthropology and evolutionary linguistics, where they can play a universal role analogous to universal Darwinism (Dawkins) and universal selectionism (Cziko).
Session: VII.11 Room: Peter Chalk 2.1
Nathalie Gontier
Centre for Logic and Philosophy of Science, Vrije Universiteit Brussel, Brussel, Belgium
Session: X.7 Room: Peter Chalk 1.6 Time: Saturday 14:00-15:30
Fabricio Gonzalez-Soriano1,2
1) National University of Mexico, Mexico City, Mexico, 2) University of Papaloapan, Loma Bonita, Oaxaca, Mexico
Preventive Discourse On Pathological Heredity: Materialization Of Medical Power In The Mexican Civil Law, 1870 To 1930
Mexico established Premarital Medical Examination (PME) as a requisite in civil law, first with an optional status in 1917 (Law of Family Relations), and then made it compulsory in 1932 (Civil Code for the Federal District and Territories). This was very early in the twentieth century, in contrast with countries like France and with the contemporary movement toward eugenic regulation of marriage in some states of the USA. In Mexico, the goal was mainly to prevent marriages between persons afflicted with venereal, hereditary and mental diseases.
This work traces the inspiration behind the introduction of the PME into Mexican civil law. It shows the event as the legal outcome of two different streams of discussion and effort in Mexican medicine, both of which started off concerned with pathological heredity. One of these began in the last decades of the 19th century, linked to a “nihilist” but prophylactic approach to hereditary diseases (portraying them as incurable but preventable pathologies) that shaped the discussion around two opposing theses about the morbidity of consanguinity: a “hard” thesis and a “soft” thesis. These discussions among Mexican physicians began to shake the civil legislation around kinship and marriage. The second dispute, in the first decade of the 20th century, stemmed from a concern with the prevention of infectious or hereditary venereal diseases (mainly syphilis) and was driven, on both the national and regional scenes, by the strong moralizing and prophylactic efforts of the medical societies, as happened in France and the USA. This research addresses the question of which specific historical actors, physicians or jurists, promoted the premarital medical examination and set it up in the law.
Session: VIII.7 Room: Peter Chalk 1.6 Time: Saturday 9:00-10:30
Session: VI.2 Room: Newman C Time: Friday 14:30-16:00
Experimentation and the Development of Lloyd Morgan’s Canon
Evolutionary concerns of the late 19th century gave rise to an interest in animal behaviour and mind that would ultimately produce a new discipline of comparative psychology. Mental evolution posed new problems involving the interpretation of animal behaviour. The solution to these problems required philosophical and methodological advancements that would enable scientists to state what the mental capacities of non-human animals are. Disputes arose over whether discussions of the subjective states of animals belonged to science.
George Romanes reported numerous anecdotes that often attributed mental states to animals. His method of attributing mental states to animals proceeded by way of analogy from his own mental states to those of the animals. He would consider the observed behaviour of the animal and then consider how he would feel when behaving similarly; then he would form an analogy and attribute a mental state similar to his own to the animal in question.
Critics of Romanes believed that there was no rigorous way of attributing mental states to animals. Different interpreters would attribute different mental states to the same animal. In the 1880s, one of Romanes’ most influential critics was Conwy Lloyd Morgan. Lloyd Morgan used experimental evidence to prove several of Romanes’ anecdotes false. Lloyd Morgan believed that science should only concern itself with animal behaviours that could be objectively observed. However, by the 1890s Lloyd Morgan would come to accept and incorporate many of Romanes’ ideas, and he would develop methods for attributing subjective states to animals.
In this paper I focus on the role of experiment in the development of Lloyd Morgan’s thinking. I argue that in order to understand how Lloyd Morgan changed from one who thought that we could not know what the mental states of animals were to one who believed that we could understand such mental states, we need to understand his use of experimentation. In 1894 Lloyd Morgan published An Introduction to Comparative Psychology, which included his famous canon. The canon states when we are secure in attributing mental states to animals and is itself a call for experimentation.
Don Goodman
Washington University in St. Louis, St. Louis, MO, United States
Woodward’s Modularity Condition, the Causal Markov Condition, and the Causal Structure of the Central Nervous System
If there is a causal structure to the central nervous system, and it is such that we can know it, there must be constraints that allow us to segregate causal relations from mere correlations. The causal Markov condition (CMC) and Woodward and Hausman’s modularity condition are two candidates for such constraints. Woodward and Hausman argue that the modularity condition implies the causal Markov condition. However, violations of the causal Markov condition seem to be widespread; examples can be found in the study of neural network connectivity. Neural networks can, at least in principle, violate the preconditions necessary for the CMC to hold. If this is true, then the modularity condition does not hold of such neural networks. However, I argue that if the modularity condition is violated, we are plunged into causal skepticism: either the modularity condition does hold of these neural networks, or we do not have epistemic access to the causal structure of the universe. In this paper I demonstrate that violations of the CMC do not entail violations of the modularity condition. Thus, such neural networks do not stand as counter-examples to the modularity condition.
Grant Goodrich
Indiana University, Bloomington, United States
Session: V.11 Room: Peter Chalk 2.1 Time: Friday 11:30-13:00
Brian Goodwin
Schumacher College, Devon, United Kingdom
The Language of Living Processes
The complexity of the molecular networks underlying the regulation of gene activity in cell differentiation and morphogenesis has now reached bewildering proportions. It is generally assumed that within this complexity there is a deterministic pattern of interactions between molecules and with genes that will account for the robust, reliable processes of development at different levels that give rise to the coherently-formed, functional adult organism.
However, there is an alternative to this molecular determinism. This arises from the realisation that communication in regulatory processes shares basic features with languages, in particular a fundamental variability in the patterns of interaction between components, or ambiguity. The reason for this will be described in terms of a model of the evolution of language that can be applied to the evolution of communication networks in cells which explains a basic feature of languages related to word distributions, known as Zipf’s Law. Evidence from the distribution patterns of molecular components in regulatory networks consistent with this interpretation will be presented, and the reasons for ambiguity in connection with the reliability, adaptability, and context-sensitivity of languages as generators of meaning will be discussed.
Session: X.2 Room: Newman C Time: Saturday 14:00-15:30
Russell Gray
University of Auckland, Auckland, Australia
Development and cultural transmission of tool use in New Caledonian crows
New Caledonian crows have remarkable tool manufacturing abilities. They manufacture two very distinct types of foraging tool throughout their range: tools made from sticks and similar material, and those made from the barbed leaves of Pandanus trees. Our work has shown that their tool manufacture has four features previously thought to be unique to hominids: the distinct shaping of tools, the crafting of hooks, “handedness”, and cumulative changes in tool design. In this talk I will discuss recent claims that their tool manufacture skills are based on an “inherited predisposition” in the light of our recent developmental studies. I will outline why the crows’ tool manufacture does not fit easily into traditional classifications of social learning such as imitation, emulation, stimulus enhancement and individual trial-and-error learning, and discuss how high-fidelity social transmission can occur without imitative learning.
Session: V.8 Room: Peter Chalk 2.6 Time: Friday 11:30-13:00
Todd Grantham1, Mark Bedau2
Session: XIII.1 Room: Newman B Time: Sunday 11:00-12:30
1) College of Charleston, Charleston, SC, United States, 2) Reed College, Portland, OR, United States
James Griesemer
Geographic Range as a Weakly Emergent Trait
This paper examines a crucial locus in the metaphysics of evolutionary theory. On the scientific side, we discuss the biology of geographic range — a property that is central within biogeography, macroevolution, macroecology, and conservation biology. Geographic range is theoretically important because it is the kind of trait that can be shaped by species selection. On the philosophical side, we aim to refine and articulate Bedau’s theory of ‘weak emergence.’ After presenting a taxonomy of different emergence concepts, we argue that weak emergence is a particularly helpful way to understand the hierarchical nature of biology: it captures the ways in which higher-level traits depend on lower-level processes, while recognizing that emergent traits can nonetheless provide the basis for autonomous higher-level theories. A brief review of the biological literature suggests that geographic range size is weakly emergent in Bedau’s sense. If geographic range is emergent in this sense, it provides a basis for arguing that macroevolutionary phenomena cannot be fully explained by microevolutionary processes.
What Simon Should Have Said
UC Davis, Davis, California, United States
Simon says, in “The Architecture of Complexity,” that evolved structures tend to be hierarchically organized and nearly decomposable for good dynamical reasons (Simon 1962). His insight has gained renewed currency in discussions of modularity in evolutionary developmental biology. At the heart of Simon’s essay is a parable of two watch-makers, Hora and Tempus. I argue that while Simon’s parable is clever, it does not work. I ask a developmental question to challenge Simon’s assumptions: how can either Hora or Tempus make watches at all? I show that Simon’s account is dynamically insufficient and suggest that what Simon should have said is that “scaffolding” helps answer the developmental question. Unfortunately, what Simon should have said about scaffolding undermines his argument for near decomposability, a result already in the literature (Wimsatt 1974). A different approach to complexity and organization that takes scaffolding interactions into account is needed to articulate concepts of descriptive and interactional complexity and modularity better suited to evolutionary developmental biology. I attempt to sketch an approach using scaffolding interactions to help identify boundaries of developmental modules.
Session: VIII.11 Room: Peter Chalk 2.1 Time: Saturday 9:00-10:30
Mathias Grote
Humboldt-University Berlin, Berlin, Germany
Session: VI.5 Room: Peter Chalk 1.3 Time: Friday 14:30-16:00
Plasmids: Between Autonomous Molecules And Symbiotic Organisms
Ever since plasmids were introduced into the emerging discipline of bacterial genetics in the 1950s, phenomena linked to these entities have been placed in an intermediary position between organisms, cellular organelles and infectious agents like phage, thereby challenging the autonomy and integrity of their hosts. Joshua Lederberg (1952) outlined plasmids as a general category covering various phenomena of cytoplasmic inheritance, infectious heredity or pathology that were at the time regarded as “symbiotic organisms” and “part of the genetic determination of the organic whole”. Their role in molecular biology changed dramatically in the 1960s, following the experimental combination of genetic analysis and biochemical preparation. Recently, the position of plasmids in the framework of genetics seems to have undergone another change, in which their importance in “horizontal gene transfer”, and hence evolution, is stressed (Goldenfeld & Woese 2007). Plasmids appear no longer as parasitic phenomena of genetics, but as one among many manifestations of heredity.
Paul Griffiths
University of Queensland, Brisbane, Australia
The Phenomenon of Homology
The concept of homology has evolved a great deal since its first mature formulation in the 1840s and there remains considerable controversy over the theoretical understanding of homology today. Surprisingly, this has not led to significant changes in the identity of specific homologues. Although great scientific ingenuity was needed in the late 18th and early 19th centuries to identify morphological homologies, most of those homologies are still recognised today. The same might be said of behavioral homologies identified by the first ethologists. I suggest that this is explained by two facts: operational criteria for determining homology carry a great deal of weight when compared to theoretical definitions, and there is a strong intellectual continuity between the operational criteria of the early 19th century and those of today.
In this paper I ask what these considerations tell us about the structure of the homology concept. I suggest that the concept is ultimately grounded not in a set of theoretical claims about homology, nor in any specific operational criteria, but in a phenomenon (sensu Brigandt, 2003). It is a manifest fact that the same parts and processes can be found in different organisms and in different places in one organism, just as it is a manifest fact that (macro) organisms form species. Homology, like the existence of species, is a phenomenon that stands in need of explanation.
In seeking to understand the epistemic status of judgments of homology I draw on the ‘new experimentalists’ Ian Hacking and Alan Franklin and their well-known argument that experiment has a ‘life of its own’. Scientists have local arguments for supposing that a certain experimental phenomenon is not an artifact and that its existence should be explained by any general theory of the domain in question. I extend this perspective from experiment to description. In mature scientific practices of description, such as anatomy and physiology, scientists have local arguments for supposing that a certain ontology of parts and processes corresponds to real structure in their objects of study and must be explained by any general theory of how those objects come into being.
This presentation will follow the historical development of plasmids along two lines. The first is the ambiguous status of these entities, in between organismal and purely physico-chemical structures; the second is their role as genetic shuttles. These analyses will focus on a turning point in both of these lines in the 1960s, when plasmids became material, visualized and workable things which then would serve as one of the most important tools of recombinant DNA technologies, by enabling transfer, storage and expression of DNA fragments. Thus plasmids became at the same time functional parts of organisms, physically defined molecules and tools crafted by humans. They are
therefore an interesting example for the historical and epistemological study of an important transition in molecular biology.
REFERENCES
Goldenfeld N. & Woese C. (2007), Biology’s next revolution, Nature 445, 369
Lederberg J. (1952), Cell genetics and hereditary symbiosis, Physiol. Rev. 32, 403-429
Session: VII.3 Room: Newman D
Time: Friday 16:30-18:00
Jesse Gryn1, Christopher Buddle2, Charles Vincent3
1) Institut Maurice-Lamontagne, Pêches et Océans Canada, Mont-Joli, Québec, Canada, 2) McGill University, Macdonald Campus, Ste. Anne de Bellevue, Québec, Canada, 3) Centre de Recherche et de Développement en Horticulture, Agriculture et Agroalimentaire Canada, Saint-Jean-sur-Richelieu, Québec, Canada
Session: III.6 Room: Peter Chalk 1.5 Time: Thursday 16:30-18:00
The Blueberry Maggot Goes to Harvard:
Guy Bush, Ernst Mayr, and the Controversy Over Sympatric Speciation
The story of how the blueberry maggot got its Latin name, Rhagoletis mendax, ends in 1964 at Harvard with the falling out of the aspiring entomologist Guy Bush and his teacher, the influential ornithologist Ernst Mayr, over the contentious concept of sympatric speciation. The evolution of new species is “allopatric” when it involves a geographic barrier, such as a mountain range or the water around an island, and “sympatric” in the absence of any such barrier. At Mayr’s Harvard, sympatric speciation was viewed as “little better than an heretical, unsupported fantasy” and Mayr’s universal theory of allopatric speciation reigned. Understandably, Guy Bush embarked on his thesis problem to “demolish claims” that sympatric speciation had occurred in Rhagoletis fruit flies. However, over the course of his Ph.D. work, Bush gradually came around to the disreputable views espoused by economic entomologists that geography did not play an all-important role in Rhagoletis evolution. Since their initial discovery in 1914, a fondness for blueberries over apples, rather than any geographic obstacle, seemed to distinguish blueberry maggots from apple maggots. Whether they were the same or different species remained an open question for the next fifty years. As a direct result of Guy Bush’s defiant dissertation, the blueberry maggot officially became Rhagoletis mendax, sibling species to R. pomonella, the apple maggot. The blueberry maggot’s story reveals how entomologists up to Bush arrived at their own non-geographical conception of speciation. Likewise, the historian Robert Kohler has recently shown how Ernst Mayr’s fieldwork with South Seas island birds formed his thinking about speciation. Geographic isolation was hard to miss in the oceanic archipelago where Mayr conducted the only major fieldwork of his lengthy career. Lately, the sympatric speciation controversy has taken on a life of its own, whereas blueberry maggots and South Seas birds have gradually receded from view. The contentious status of sympatric speciation is still a sore spot for some entomologists, even as the blueberry maggot’s Latin name has stuck and its species status endured. Both stories, about economic entomology and bird biogeography, are essential to anyone wishing to explain and move beyond the controversy over sympatric speciation.
Morphisto Evolutionsforschung und Anwendung GmbH, Frankfurt am Main, Germany
The Frankfurt-theory of constructional morphology: An innovative but unknown approach for reconstructing anagenetic events and its actual importance for understanding chordate evolution
Michael Gudo
In the past few years, results from evo-devo approaches have undermined the majority views on deuterostome anagenesis. Popular models such as the calcichordate hypothesis, the derivation of deuterostomes from lophophorate or pterobranch-like forms, or even from neotenic larvae, appear to be inconsistent with the “molecular” phylogenies. Instead, the hypothesis of a polysegmented ur-deuterostomian descending from an annelid-like protostome has been renewed. From a morphologist’s view it must be asked whether this hypothetical ur-deuterostomian resembled a cephalochordate, i.e. an ancestral chordate which eventually lost its notochord in the “ambulacrarian” line leading to hemichordates and echinoderms. The aim of the talk is to show that anagenetic models proposing a “complex” chordate construction at the base of the Deuterostomia had already been developed forty years ago in the so-called “Frankfurt theory”, which however is widely unknown in the Anglo-American scientific community. This reconsideration of a widely ignored anagenetic model will also include a discussion of the scientific principles underlying the “Frankfurt approach” as well as other reconstructions of anagenetic processes.
The Frankfurt theory argues about evolutionary transitions on the basis of structural-functional differentiations which are guided by functional aspects of organismic design. This means that at least basic natural laws, such as biomechanical, hydraulic, and thermodynamic principles, provide the basic triggers for evolutionary changes. Accordingly, evolution has to be understood as a process of organismic changes throughout generations. The Frankfurt theory summarizes evolution as the result of a morphoprocess; this processual view helps to determine the validity of different concepts in evolutionary research.
Session: IV.7 Room: Peter Chalk 1.6 Time: Friday 9:30-11:00
Session: XI.6 Room: Peter Chalk 1.5 Time: Saturday 16:00-17:30
Carlos Guevara-Casas
Mathias Gutmann
National University of Mexico, Mexico City, Mexico
University of Marburg, Marburg, Germany
Methodological convergence of conceptual interpretations in medicine and taxonomy
Since the work of Thomas Sydenham in the 17th century, diseases have been seen in a way similar to the biological entities studied by systematists and taxonomists. In some sense, the comparison arises from the assumption of the existence of pathological entities as well as of healthy subjects of reference. This means that the healthy condition, the ideal condition, is analogous to the exemplary type (holotype) from which the characteristics of a whole species are described. However, the determination of the status of holotype is arbitrary. Regularly, the holotype is the first specimen collected and described in accordance with the canons of the specialists in the systematics of the most closely related taxonomic group. In a way similar to the biological exemplars, the new tools of molecular biology and genetic engineering have generated new criteria for establishing a standard in the conceptions of health and illness. One of the difficulties of the implicit concept of genetic normality in the Human Genome Project (HGP) is that it does not admit variation. If the genome is considered the standard of normality, then all variation can be considered abnormal to some degree. The same tension that exists between the referential role of the taxonomists’ standard specimens (holotypes) and the role of variability for evolutionists and population geneticists occurs between the clinical reality of diagnosis and the medical assumptions derived from the HGP. This may be a consequence of the association of illnesses with genes, since one way to identify a gene is through a pathology derived from an alteration in the hereditary material. Although there are practical reasons that explain the search for correlations between certain pathologies and some genetic characteristics, it is also true that the only way in which the genetic origin of diseases becomes the sole causal premise is to assume the immutability of DNA and, paradoxically, to negate the processes of genetic expression. Although the total sequence of the HGP does not come from a single individual, as in the case of the taxonomic description of a holotype, the samples used by Celera Genomics were obtained from only 6 people (71% from a single donor), a choice as arbitrary as that of the first described specimen of a species. In this sense, to consider the sequence obtained in this project as a reference genotype is to subject it to the same limitations as taxonomic descriptions.
Is information a metaphor, an allegory or a model?
Metaphors are often understood as dysfunctional aspects of language. Nevertheless, rhetorical speech modes play a crucial role in the building and understanding of scientific theories. By discerning trivial from necessary metaphors (and closely related speech modes), their constitutive and indispensable function for theory-construction can be determined in a first step. The logical transformation of those speech modes into well-regulated model-descriptions, which allow the identification of the modellans as well as the modellatum, is reconstructed methodologically in a second step. Finally, the contribution of descriptions of this type to human self-understanding and self-constitution is discussed in the light of descriptions of human beings as “informational beings”.
Session: VI.9 Room: Peter Chalk 2.5 Time: Friday 14:30-16:00
Neil Haave
University of Alberta, Camrose, Alberta, Canada
Why Teach History and Philosophy of Biology to Biology Majors?
In the early 1990s the Augustana Faculty (then Augustana University College) of the University of Alberta embarked on a curriculum review which resulted in our current liberal arts and sciences core curriculum. One result was the establishment of required capstone courses in each major. Capstone courses were to critically reflect on the historical development of the discipline. With educational resources provided to me by ISHPSSB I developed the course AUBIO 411 – History and Theory of Biology as the capstone for our Biology degree program. This course is open to fourth year Biology majors who are not expected to have background in history or philosophy.
The course is structured into three sections. The first lays a simple foundation in philosophy of science and
biology. The second section examines the historical progression of biology, focusing on evolution, genetics and development. The last couple of weeks have students consider some of the social implications of our biological concepts. The point of the course is not to train students as historians or philosophers; rather, it is to bring students to an understanding of why biologists ask the questions they ask today.
Students typically begin the course with some trepidation, unsure why they are being required to learn philosophy and history in a biology major, but upon completion consistently express surprise at how relevant the course became for them. In this presentation I argue that undergraduate biology programs best serve students when they attend not only to developing students’ technical and conceptual capacity in biology but also to having students reflect on why the discipline is currently structured as it is.
Session: II.11 Room: Peter Chalk 2.1 Time: Thursday 14:30-16:00
Sören Häggqvist
Did Fisher’s Voluntary Workers at Rothamsted Make a Difference in the Spread of Statistical Techniques in Agriculture?
Sir Ronald Fisher (1890-1962) is well known in the fields of mathematical statistics and genetics, for his book The Genetical Theory of Natural Selection (1930) and for developing such statistical and experimental techniques as the Analysis of Variance. Many of Fisher’s foundational statistical methods were devised while he was employed as a statistician at Rothamsted Experimental Station, an agricultural facility north of London. His 1925 book, Statistical Methods for Research Workers, introduced the Analysis of Variance and the requirement of randomization in experiments, as well as other experimental design techniques. As Fisher’s methods became known, laboratories, government organizations and universities began sending their personnel to train with Fisher. These voluntary workers flocked to Rothamsted, which did not offer any monetary support, since their home institutions funded them to study there. Their periods of residence varied from as little as three weeks to as long as three years. They eventually returned to their home countries, taking with them Fisher’s statistical methods and experimental designs. The thirty-five trainees thus far identified returned to Australia, Brazil, Canada, Ceylon (now Sri Lanka), Germany, Greece, India, Ireland, the Netherlands, Sweden, Tanganyika (now Tanzania), Trinidad, Uganda, and the United States. This paper will ascertain whether these workers actually translated the knowledge they gained from Rothamsted into statistical practice in their own academic disciplines.
Stockholm University, Dept of philosophy, Stockholm, Sweden
The Select Few: Etiological Functions and Normativity
One of the principal virtues claimed for etiological or so- called selected effect theories of biological function is their ability to account for the apparent normativity of function ascriptions. The idea, advocated by e.g. Ruth Millikan and Karen Neander, is that the selection history of a trait imposes norms concerning what the trait is supposed to do: its function. These norms apply also to instances of the trait type that are incapable of performing their task. Thus a malfunctioning or diseased heart may be unable to pump blood. But pumping blood is still its function, since hearts were historically selected and reproduced because they performed this capacity.
Paul Sheldon Davies has recently argued that this virtue is illusory. According to him, etiological theories entail that malfunctions are impossible. This is because they are committed to defining traits as functional categories; hence, as types individuated by reference to actual performance of the function in question. Etiological theories are therefore barred from counting the defective heart as a member of the trait type subject to the norm that it is supposed to pump blood. Conversely, this norm applies only to well-functioning hearts, since they exhaust the category defined by selective success. Hence compliance with etiologically established ‘norms’ is vacuous, and etiological theories utterly fail to account for functional normativity.
I seek to counter Davies’ argument in two ways. The first exploits the fact that selective success comes in gradations. This means that the distinction between success traits and generic traits that Davies relies on is arbitrarily coarse. But if this is corrected, repeated applications of Davies’ argument render the notion of a success trait incoherent. Secondly, I show that the etiologist is both forced and allowed to sort token traits into categories by appeal to homology, and that this undercuts Davies’ key premise. In closing, however, I argue that etiological theories do indeed have trouble accounting for normativity, though for reasons other than Davies suggests.
Session: II.7 Room: Peter Chalk 1.6 Time: Thursday 14:30-16:00
Nancy Hall
University of Delaware, Georgetown, DE, United States
This paper will survey the later practices of those (mostly) young scholar-trainees. Some of their papers include C.H. Goulden’s (1932) “Application of the Variance Analysis to Experiments in Cereal Chemistry,” R.J. Kalamkar’s (1932) “Experimental Error and the Field-Plot Technique with Potatoes”; T. Wake Simpson’s (1938) “Experimental Methods and Human Nutrition” and F.R. Immer’s (1945) “Some Uses of Statistical Methods in Plant Breeding”. Other papers applied statistics to experimental problems relating to apples, cotton, beets, barley, wheat, genetics, sociology and fishery practices. Their espousal of Fisherian statistics in their publications offers confirmation that these voluntary workers disseminated Fisher’s methods around the world and in many academic fields.
Session: IV.11 Room: Peter Chalk 2.1 Time: Friday 9:30-11:00
Arizona State University, Tempe, Arizona, United States
What Makes a Group an Evolutionary Unit? Reliability and the Transition to Sociality in Hymenopterans
This presentation raises issues similar to those discussed in the others in this session, but shifts the focus to the evolution of sociality in insects. While the questions are essentially the same—Under what conditions do interacting individuals form an evolutionary macro-unit, and What are selection’s contributions at each level of organization?—the approach is quite different. Instead of emphasizing population structure and population genetics, I will be reporting on results of work in behavioral ecology now underway at the Center for Social Dynamics and Complexity and in the Social Insect Research Group at Arizona State University. Our approach is quite different from the one with which philosophers are most familiar, and it leads us to interestingly different answers to questions about what kind and how much cohesion is necessary for group selection to become an important force. The main difference, which I will draw out in this talk, is that behavioral reliability (for which there are realistic phenotype-level models) is emphasized over genetic models (for which there are no realistic phenotype-level models).
Session: III.9 Room: Peter Chalk 2.5 Time: Thursday 16:30-18:00
Andy Hammond
Independent, London, United Kingdom
Session: III.2 Room: Newman C Time: Thursday 16:30-18:00
JBS Haldane and speciation: not a beanbag but a full bag
Much has been written about the debate between RA Fisher and Sewall Wright in the 1920s and 1930s over the fundamentals of population genetics. But JBS Haldane’s role in this debate, although acknowledged, has not been examined in any detail. A similar point can be made about the acknowledgements that Haldane played a role in the Modern Synthesis of evolutionary theory that crystallised in the early 1940s. A contrasting negative view of these contributions is that of Ernst Mayr, who in 1959 famously denigrated population genetics as “beanbag genetics.” The accusation was that population genetics reduced evolution to the addition and subtraction of genes from the gene pool of a population by the selection of individual genes (each with a fixed adaptive value). The discipline’s practitioners, including Haldane, were therefore restricted to this mechanistic and overly simplified picture of evolution. Common to all of these claims is the question of the nature of speciation. So in this paper, as a first step towards addressing these issues, I will track the development of Haldane’s ideas on speciation in the 1920s and 1930s. During this development he utilised not just population genetics but also chromosomal theory, biochemistry, embryology, ecological mechanisms, and the organism-gene-environment interrelationship. Haldane’s increasingly sophisticated picture of speciation does not fit the depleted or reductive view of the ‘beanbag’ but rather that of a ‘full bag’, i.e. a multifaceted, multidisciplinary approach driven by both his philosophy and his science.
Andrew Hamilton
Beth Hannon
Durham University, Durham, United Kingdom
Fetal programming, predictive adaptive responses and genecentric thinking
In recent years attempts have been made to construct an account of both development and evolution that does justice to the concerns of both. This paper will look at one specific area where developmental and evolutionary biology explicitly cross paths and examine some of the issues that arise from this. Research into fetal development has suggested that events in the womb can have a lasting, even lifelong, phenotypic effect. Two models that have been offered to explain this phenomenon will be examined. The fetal programming model suggests that events in utero can programme development in particular ways and as such influence later phenotypes. A second approach emphasises evolutionary considerations, and suggests the phenomenon is an example of a predictive adaptive response (PAR). Some authors who adopt the PAR model have argued that their approach is compatible with the fetal programming model. However, this paper will suggest that this is not the case and will highlight fundamental disagreements between the two approaches, both in terms of the role they give to the environment and the attention they pay to the details of development. While both models may highlight important aspects of fetal development, they can also be misleading. I will argue that this is because in both models the genome is considered to encode instructions for development. The consequence of this assumption for both models differs. In terms of fetal programming, the primacy of the gene in development is one of the factors that leads to the implicit assumption that the environment’s role in utero is primarily one of disruption. The assumption of the primacy of the gene in development also allows those adopting the PAR approach to sidestep the details of development and see predictive responses where there are none. I will argue that the view of development offered by developmental systems theorists can avoid the problems that plague both fetal programming and the PAR account. The developmental systems approach rejects the notion that the genome encodes instructions for development, and instead sees genes as just one among many important developmental resources that interact in specific ways to produce phenotypes. I will suggest that this approach is better placed to give a satisfactory account of the effects of the uterine environment on later development as it can incorporate the positive aspects while avoiding the problems of both the fetal programming and PAR models. If this is the case, it will add weight to the claim that a developmental systems approach is a more useful way to approach the issues of development and evolution.
Time: Friday 9:30-11:00
such as quantifying a person’s risk of suffering from a certain disease in the future, confirming her family or group membership, her presence at a crime scene, her sensitivity to certain drugs, and the characteristics of her ancestors. Actors and institutions involved are individuals, medical researchers, police and travel agencies. Testing kits currently on the market create new social identities that upset established systems of social markers and produce imbalances in power negotiations. The public appears to lack knowledge of the limitations placed on genomic testing for reasons of methodology and validity, and industry is criticized for its ruthlessness in selling hopes and presenting dubious conclusions as certainties. Scientists themselves voice disapproval of the way their science is being used to draw conclusions about identities. This talk presents a philosophically informed social scientific interpretation of the use of genomics in society as a major tool for the construction and affirmation of social identities. From this perspective, genomics appears extraordinarily successful in establishing itself as a flexible societal instrument for marking identities, not least because it veils the constructionist character of these social classifications with the definiteness of presumed findings.
Session: IV.5 Room: Peter Chalk 1.3
Scientific Pluralism and the Evolutionary Explanation of Development
In a recent volume on scientific pluralism Helen Longino and Ken Waters argue, in separate works, that the debate about the gene’s centrality in the life-sciences is sustained by a widespread and mistaken view about the nature of scientific knowledge and inquiry. According to this mistaken view, for any phenomenon there is a single best method of inquiry to understand it. By ignoring the fact that different researchers can approach one and the same phenomenon with different questions, this monistic view hampers attempts to understand how, for example, phenotypes are produced. Consequently, Longino and Waters both endorse a pluralistic view of scientific inquiry. On their view, once the poverty of monism is revealed, the debate about genes ought to be seen as illusory.
Moreover, Waters argues that the focus on genes really is warranted but not because genes are the key to explaining evolution and development. (Waters is suspicious of the emphasis on explanation, generally.) Rather, according to Waters, genes provide a unique epistemological entry point for studying development in that we can learn a lot about development by manipulating genes. In service of this point, Waters
Christine Hauskeller
ESRC Centre for Genomics in Society, University of Exeter, Exeter, United Kingdom
The promises of genomics: only society makes them reality!
Gene-related science promised deep insights into important aspects of human existence. Therefore it should perhaps not be surprising that genetic tests are being increasingly employed for a variety of purposes
Session: II.9 Room: Peter Chalk 2.5 Time: Thursday 14:30-16:00
Jesse Hendrikse
University of Calgary, Calgary, Alberta, Canada
deploys a distinction between the investigative reach and the explanatory scope of a scientific approach. On Waters’ proposal, gene-centrism ought to be understood as an approach with broad investigative reach and narrow explanatory scope. However, I will argue that:
1) Waters’ distinction between investigative reach and explanatory scope is difficult to sustain when it comes to phenotypic development.
2) Parties to the debate about the centrality of genes in development are best interpreted as asking the same question about how phenotypes come about. So we can accept the pluralists’ point that different parties can approach the same phenomenon with different questions and still be moved by the debate about genes. After all, if two parties approach the same phenomenon with the same question, and arrive at different answers, they would seem to be genuinely at odds with one another.
Finally, I will show how this debate commends a pluralistic view of scientific explanation (in particular, my own).
the theoretical work started by evolutionary epistemology. It tries to answer traditional epistemic enquiries by analyzing the place where knowledge is produced: the brain. These three programs have embraced some kind of reductionism, mainly an eliminative one, and this has led them into major impasses. I will review these programs, examining the last one in depth. The twofold purpose is to show that there is no way back once naturalized epistemology has arrived, insofar as knowledge is a subject to be disentangled interdisciplinarily, but also that we should not be held captive by exaggerated versions of it, those that do not take human reasoning into consideration. The further purpose is to point out that, despite the impressive results reached by cognitive neuroscience with regard to knowledge, important philosophical work is still required; that is, well-done philosophy has an important impact on cognitive science.
Session: VI.6 Room: Peter Chalk 1.5 Time: Friday 14:30-16:00
Richard Holdsworth
Egenis, University of Exeter, Exeter, United Kingdom
Session: XIII.10 Room: Peter Chalk 2.3 Time: Sunday 11:00-12:30
Paola Hernandez Chavez
Centro de Estudios Filosóficos Lombardo Toledano, Mexico City, Mexico
Different disciplines, different perspectives on the pertinence of genomics to ways of studying human behaviour: lessons of interviews with researchers
Since different research disciplines have their different perspectives on ways of studying the phylogeny and ontogeny of human behaviour, and since they also vary in the extent to which they make use of the findings of genomic research in their work, there is a discernible “trend towards disciplinary segregation” in this general field. As a check on this diagnosis, however, it is necessary to ask to what extent the human genome itself, since it underpins the diverse research efforts, serves as a connecting factor. If we find that developments in one area of the field can pull others with them, we may conjecture that there are centripetal as well as centrifugal tendencies at work among the disciplines. The conclusion might be that success breeds success, and that the research tends to go where the money is, figuratively and literally. This paper takes a preliminary look at arguments such as these in the light of a series of interviews with researchers conducted recently in the context of an epistemological analysis of criteria for the conceptual mapping of research in the genomics of human behaviour.
Reductionism in Some Naturalized Epistemologies, or Why Philosophy Matters
At the end of the previous century a remarkable reformulation of epistemology emerged: naturalized epistemology. What is characteristic of this epistemology is its rejection of infallibilism and apriorism. It asserts that empirical scientific results are crucial to solving traditional inquiries about knowledge. Quinean naturalized epistemology claimed that we should abandon traditional epistemology and replace it with psychology. Another branch of naturalized epistemology is evolutionary epistemology, an approach aiming to answer traditional epistemological questions on the basis of the theory of evolution by natural selection. It has two different but interrelated programs: the first (EET) accounts for scientific theory change as resembling the mechanisms of natural selection; the second (EEM) studies the development of our cognitive capacities and structures, as well as their fixation in our brain over the course of evolution, extending the biological theory of evolution to cognitive activity and its apparatus, such as the brain and the sensory and motor systems. After evolutionary epistemology came neurophilosophy, another brand of naturalized epistemology, one which continues much of
Session: XI.3 Room: Newman D Time: Saturday 16:00-17:30
Session: XI.10 Room: Peter Chalk 2.3 Time: Saturday 16:00-17:30
Nick Hopwood
Philippe Huneman
Department of History and Philosophy of Science, University of Cambridge, Cambridge, United Kingdom
‘Skandalon’: Haeckel’s pictures of embryos in the struggle of world views
As I write this abstract (February 2007), creationists and their opponents are locked in polemics over the use in twentieth-century textbooks of comparative embryological illustrations that the German evolutionist Ernst Haeckel was first accused of forging some 130 years ago. The pictures became controversial during the initial middle-class reception of Darwinism in the 1860s and ’70s. But the issue only became big news in 1908–10, as part of a much larger, more highly organized and more intense ideological struggle over working-class reading. This talk will compare this debate to the earlier controversy and discuss its legacy. How, I shall ask, have public contests over scientific pictures changed?
Time: Thursday 16:30-18:00
Leon Croizat: A Radical Biogeographer
The papers contributed to Rebels of Life tend to concern twentieth century biologists who not only held iconoclastic views but also turned out to be basically right. But the vast majority of iconoclasts turn out to be wrong. In my paper I chronicle the life of Leon Croizat. For most of his professional life Croizat was clearly an outsider, objecting to the received views on biogeography, but in the early 1970s, advocates of cladistic analysis took on Croizat and his views. Although various versions of cladistic analysis have become widely accepted, Croizat’s panbiogeography continues to be considered marginal at best. In my paper I set out what I take to be the most important factors that played a role in the differing fates of these two scientific research programs. To be sure, reason, argument and evidence played an important role, but so did other factors including those that are psychological and sociological. Even such things as wars can influence the course of science. Did cladists win and Croizat lose, and if so, what counts as “winning” and “losing” in science?
Different research practices in early molecular genetics: Oswald T. Avery’s and Max Delbrück’s revolutionary findings and early responses
IHPST CNRS, Paris, France
Session: III.1 Room: Newman B
Evolvability, transitions and the emergence of new individuals
Although a gradualist theory of evolution, neo-Darwinism makes room for discontinuous changes, under the name of (key) innovations, characterized by certain causal ecological and phylogenetic patterns. Evolutionary transitions as studied by researchers after Maynard Smith & Szathmáry (1995) or Michod (2001) don’t fit this schema, because they display the emergence of new individuals, emergence taken here in the rigorous sense of the possibility of new measures (multilevel selection 2 in the sense of Damuth and Heisler 1988), together with the arising of a new kind of causality, i.e. a new form of selection that will buffer the novel individual against perturbations. This irreducibly involves multilevel selection, since at two levels of individuality selection acts in opposite ways (towards cooperation or towards defection).
Thus, after having distinguished evolutionary transitions from evolutionary innovations by showing that they require a formally rigorous concept of emergence, I will address the issues of adaptation and evolvability in this framework. Usual adaptive explanations are framed in the context of a selection vs constraints (phylogenetic/developmental) issue, whereas here what accounts for a drop in fitness is selection acting at another, higher, level; thereby new kinds of adaptations are connected with across-level trade-offs in fitness. In the process of emergence of new individuals understood as cohesive wholes of lower-level individuals, one can’t equate adaptation with optimality or increase in fitness, or even with optimality under constraints. In particular, before the buffering mechanism that enforces cooperation, and thereby ensures the consistency of the new individuals, is in place, there are no adaptations in the sense of traits being selected through the higher-level selection process, so one can make sense of the equivalent of preadaptations at the higher level of individuality.
Finally, evolvability is traditionally thought of as the availability of variation through mutation or recombination (in the case of sexual reproduction); however, while each transition triggered some amount of evolution, the evolvability gained in those cases has to be thought of in other terms. I will argue that evolvability there must be defined according to the decoupling of selective processes at several levels. This makes a case for a conceptual relation between emergence and evolvability.
David Hull
Northwestern University, Evanston, IL, United States
Session: I.9 Room: Peter Chalk 2.5 Time: Thursday 11:30-13:00
evolutionary theory. Evolutionary theory is still understood in terms of the neo-Darwinian Modern Synthesis, which denied any significant role for Lamarckian and saltational processes. Although there is no doubt that the cumulative selection of small, blind, genetic variations plays an important role in evolution, Lamarckian processes are clearly significant too, and under certain conditions can lead to targeted and saltational changes that reorganize the epigenome. I argue that in the light of new data, mainly from studies in molecular and developmental biology, the Modern Synthesis no longer offers an adequate theoretical framework for evolutionary biology. A new framework for evolutionary thinking is required, and its construction is underway.
Simon Huttegger
Konrad Lorenz Institute for Evolution and Cognition Research, Altenberg, Austria
The Evolution of Simple Communication Systems
Information transfer is ubiquitous in biological systems. Sender-receiver games may serve as a convenient baseline model for understanding how efficient transfer of information can evolve. In a sender-receiver game a sender picks one among a number of signals after some event has occurred. In response to this signal, a receiver picks one among a certain number of actions. To get a positive payoff, sender and receiver must coordinate events and actions.
Although sender-receiver games have been studied for a while now, their evolutionary dynamics is not fully understood yet. Standard evolutionary dynamics as given by the replicator equations does sometimes lead to states of perfect communication. Sometimes it converges to a component of states of partial communication which may be far from efficient. On such a component, neutral drift is possible. Moreover, these components are structurally unstable. This means that they will not persist under certain perturbations of the replicator equations.
Sender-receiver games thus raise a number of interesting issues. Besides offering a baseline model from which to study information transfer in biological systems, the existence of connected components of stable states, as well as their structural instability, is important for evolutionary models in general. I will discuss some recent research on the evolution of communication in sender-receiver games, focusing in particular on the differences between the replicator equations and other kinds of evolutionary dynamics.
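For orientation only (this formalism is not spelled out in the abstract), the replicator dynamics referred to above is standardly written, for the frequency x_i of strategy i and a payoff matrix A, as

\[
\dot{x}_i \;=\; x_i\left[(Ax)_i - x^{\top} A x\right], \qquad i = 1, \dots, n,
\]

where (Ax)_i is the expected payoff to strategy i and x^{\top} A x is the population mean payoff. In a Lewis-style sender-receiver game, one common setup takes each strategy to be a pair of a sender rule (mapping events to signals) and a receiver rule (mapping signals to acts); rest points at which these rules are only partly coordinated make up the components of partial-communication states mentioned in the abstract.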
Session: VIII.1 Room: Newman B Time: Saturday 9:00-10:30
Session: IV.4 Room: Peter Chalk 1.1 Time: Friday 9:30-11:00
RSSS, Australian National University, Canberra, Australia
The Archaeology of Cultural Inheritance in Early Homo
Archaeology has long relied upon a notion of cultural inheritance. Tool classifications and seriation — the temporal ordering of archaeological data — implicitly rely on an assumption of cultural inheritance. However, recent archaeological work examining the manufacture of tools by hominins threatens to dismantle traditional tool typologies. Tool types may well be stages in reduction sequences, rather than manufactured to specific cognitive “templates.” Cultural inheritance in such a context should be seen not as the inheritance of a specific set of tool types, but rather as the inheritance of a method of manufacture. Consequently, we need to examine modes of inheritance, including niche construction, within the context of manufacture, to see whether they can make sense of tool lineages. I contrast two dynamics within stone tool making: decision making at the level of the individual, which utilises ideas of the extended mind outlined by Andy Clark, and the emergence of regionalisation of tools in the Middle Stone Age. I argue that cultural inheritance needs to be seen in the context of the interplay of these two complementary processes.
Eva Jablonka
Tel-Aviv University, Tel-Aviv, Israel
The Developmental Aspect of Heredity and Evolution
The inheritance of developmental variations – phenotypic variations that are independent of variations in DNA sequence, and targeted DNA changes that are guided by epigenetic systems – is now recognised to contribute to the transmission of information between generations of organisms, as well as being a crucial part of their ontogeny. However, such hereditary transmission has not yet been fully incorporated into
Ben Jeffares
Session: XII.5 Room: Peter Chalk 1.3 Time: Sunday 9:00-10:30
was certainly important to the thinking of many early twentieth-century genetics researchers, Jennings stands out among Americans for his philosophical pragmatism and, concomitantly, for the emphasis he placed on the progressive rather than the selective nature of evolution. This paper considers Jennings’ experimental genetics research program and his public positions on education and eugenics in light of his pragmatism and his focus on progressive evolution. It considers the impact on his scientific products, inside and outside the laboratory, of Dewey, James, and the broader pragmatist milieu in which he functioned. Through a brief examination of strategies of visual representation, I will conclude by contrasting the visual imagery of Jennings’ pragmatist genetics with that of competing early twentieth-century genetics research.
Chris Jenson
University of Utah, Salt Lake City, Utah, United States
The Case for a Frequentist Interpretation of Fitness
I argue that a frequentist interpretation of probability
is the most appropriate for conceptualizing fitness in evolutionary theory. John Gillespie (1973, 1974, 1977) demonstrated that if we understand fitness according to the propensity interpretation (Mills & Beatty 1979), then the spread of a trait depends on statistical facts that are independent of any individual organism’s fitness. Beatty and Finsen (1989) have noted that more than one statistical operation (e.g. arithmetic mean, variance, skew, etc.) may be applied to the distribution of reproductive probabilities that fall out of the propensity interpretation and that no one statistical operation is suitable for every circumstance. These problems arise because the propensity interpretation treats fitness as a property of individual organisms rather than of populations. These problems can be avoided by treating fitness as a property of populations. This is in line with Fisher, Haldane, and Wright, who developed genetical theories of evolution by natural selection which make no reference to the properties of individuals and their relation to environmental conditions. Kitcher and Sterelny (1988) argue that the aim of evolutionary theory is to “make clear the central tendencies in the history of evolving populations.” I am advocating the application of population thinking to the concept of fitness. If fitness ought to be treated as a property of populations rather than individuals, then a frequentist interpretation of the probabilities involved is the most natural. Instead of averaging over a distribution of the possible reproductive outcomes of individuals, I propose that fitness be treated as the limiting frequency of a genotype or phenotype given an environment or distribution of environments.
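To illustrate the contrast (a sketch in my own notation, not drawn from the abstract), the frequentist reading interprets the reproductive probabilities themselves as limiting relative frequencies. For a genotype G in environment E, the probability of a given reproductive outcome k would be read as

\[
\Pr(k \mid G, E) \;=\; \lim_{n \to \infty} \frac{\#\{\text{trials with outcome } k \text{ among the first } n\}}{n},
\]

so that fitness becomes a quantity defined over a population-level frequency distribution of outcomes, rather than a propensity borne by any individual organism.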
Session: II.8 Room: Peter Chalk 2.6 Time: Thursday 14:30-16:00
Session: XII.11 Room: Peter Chalk 2.1 Time: Sunday 9:00-10:30
Konrad Lorenz Institute for Evolution and Cognition Research, Altenberg, Austria
A Story About Story Telling
Before we learned in the 1990s that women do show signs of ovulation, more than 30 authors had expressed their independent opinions on the question of why ovulation is concealed in humans. The well-accepted fact that humans are nearly unique in this respect had posed a riddle about the evolutionary background of this trait.
The explanations were manifold, but can be grouped into nine basic ideas that cover, besides other aspects, the “good old values” like social bonding and monogamy, but also, from the late seventies on, the idea of extra-pair copulation with a focus on women’s benefits.
The riddle of concealed ovulation is a good example of the scientific effort to explain a phenomenon, even though data is lacking, by relying on basic evolutionary assumptions, being willing to drop wrong hypotheses, and immediately incorporating new insights.
I will present the different approaches, basic assumptions, and their main proponents, and try to link the main hypotheses to the scientific developments in primatology and human behavior at the time.
Judy Johns Schloegel
Independent Scholar, Illinois, United States
What Does a Pragmatist Genetics Look Like? Herbert Spencer Jennings and the Politics of Evolution and Heredity
Evolutionary theory was central to nearly every aspect of Herbert Spencer Jennings’ upbringing, education, and self-conception as a scientist. It organized not only his metaphysical assumptions but his experimental methodology and research program. While evolution
Astrid Juette
Session: X.11 Room: Peter Chalk 2.1 Time: Saturday 14:00-15:30
stability? Recent work appeals to stabilizing selection to account for the stability, but obviously this simply raises the further questions of why stabilizing selection is so prevalent and what accounts for it.
In this paper, I suggest that the stabilizing selection responsible for evolutionary stasis might be related to the mechanisms that actively produce developmental stability. If this is correct, the main factors responsible for stabilizing selection are not stable sets of external environments that the population is adapted to, but rather the necessity of maintaining developmental stability in the face of changing environments. This has obvious implications for arguments that attempt to link breakdowns in the stabilizing systems to important speciation events.
Hyo Yoon Kang
Max Planck Institute for the History of Science, Berlin, Germany
Genes are patents, patents are genes: the rise and fall of a scientific metaphor in legal analogy
The ‘gene’ has been characterised as a ‘concept in tension’ (Falk). This paper argues that the patent is an equally ambiguous legal form. In patent law, the concept of the gene continues to remain rather undefined and hovers around different types of claims and patent categories, and, as this analysis suggests, shows a trend towards diminishing occurrence.
In an effort to gain a historical understanding of the transfer and re-characterisation of the ‘gene’ in patent law, the paper traces the employment of the term ‘gene’ in U.S. patent documents. The study presents a sharp decline in its usage in patent claims between 2000 and 2005, after a significant rise in the 1990s. The analysis furthermore indicates that the so-called ‘gene patents’ have involved very different forms, concrete materials, various techniques, proprietary scope and subsequent licensing strategies. What patents in genetic material and information seem to effect is a folding of what should be understood as only proxies or allegories of a material entity back into itself, by imbuing them with an independent existence and varying social force through patent law’s codification of a potential (monetary) value. The unstable concept of the gene has thus been framed by an equally unstable and oscillating potentiality of the legal form of the patent. It is precisely their transferability and sociability, and ultimately their elusiveness, which have marked both ‘gene’ and patents as slippery and therefore powerful constructions, without a clear indication of what the precise link between the ‘gene’ and the essence of the intellectual property right consists in.
Session: XII.10 Room: Peter Chalk 2.3 Time: Sunday 9:00-10:30
Jonathan Kaplan
Oregon State University, Corvallis, OR, United States
Evolutionary Stasis and Developmental Stability: Are they related?
The so-called “paradox of stasis” notes that many groups of organisms do not undergo appreciable phenotypic evolutionary change over very long periods of time. While, for example, Eldredge and Gould’s “punctuated equilibrium” was an attempt to account for this phenomenon, it failed to address what many consider to be the big question – why was there so much
Fred Keijzer
University of Groningen, Groningen, Netherlands
Session: XII.3 Room: Newman D Time: Sunday 9:00-10:30
Animality: Where Cognition Might Start As A Biological Phenomenon
Suppose we accept the claim that we lack, and need, clear ideas as to what cognition might be and, further, that biology is the right place to look for more definite answers. How to proceed from here? A helpful suggestion comes from work within the embodied cognition movement, which situates the foundation of cognition in a perception-action context, as Tim van Gelder does in his famous paper, ‘What might cognition be, if not computation?’ An important problem with this work is that the notions of perception and action remain part of our intentional vocabulary and are as difficult to pin down as the concept of cognition itself. ‘Perception’ and ‘agency’ remain to a large extent a matter of ascription, typified by Dennett’s intentional stance, which can be applied to humans, animals, robots, heat-seeking missiles, and thermostats alike. The concepts of perception and action do not necessarily refer to a specific kind of physical organization that can be teased apart and studied in detail to help us understand these notions better. Within a biological context, however, the concepts of perception and action can be specified more precisely, as here they do point to a particular sort of material organization. Within the biological domain, perception and action are exemplified by free-moving creatures, which exhibit motility initiated, guided and modulated by feedback-based sensory processes. This dynamic sensorimotor organization is a physically describable setup that can take many different forms and can be cast in evolutionarily incremental stages, each being highly specific to the kind of organism involved but with general features shared with other organisms. This setup does
not require a nervous system, as research into the motility of free- and group-living bacteria demonstrates, but sensorimotor organization can become much more complex and extended if a nervous system is present. In this paper, I introduce the notion of animality to refer to the domain of dynamic sensorimotor organization found in nature. Using the hydromedusan jellyfish Aglantha digitale as a case study, I will sketch the differences between animality and the notion of agency, and argue that animality provides a promising way to ground perception and action in a specific kind of biological organization. Once perception and action are adequately grounded, the next step toward understanding cognition should become clearer.
Time: Sunday 11:00-12:30
The Ontology of Race
Race is one of the most regularly used discriminators to determine to what kind a human being belongs. I will challenge the increasingly popular view that seeks to define race in terms of genomic and phylogenetic lineages. In this view, an individual’s race is perceived as phylogenetically traceable, written into one’s genomic inheritance or, as has been recently suggested, in terms of certain genomically identified clusters of racial indicators which can be used to underlie racial identity. If we rely solely on locating biological entities in terms of an account of ancestral lineage, this results in treating the contents of biological thought as static points or entities. No matter how helpful this ontology may be in isolating the crucial parts or mechanisms within a biological system, it will not grasp the process as a continuous spatiotemporally directed activity which has to be understood beyond the skin of the individual, in relation to her interaction with the environment. This is a process which is causally reciprocal between cultural, psychological and developmental factors. Furthermore, by focusing on one aspect of racial identity we fail to consider the actions of biological agents as processes towards a telos. Therefore, only a limited understanding of an ontological category such as race is available if data is interpreted purely phylogenetically. That view needs at least to be supplemented by the ontological or conceptual lens of a teleological process ontology.
I go on to show that at the level of biological ontology there are categories, such as race (and species), which are inherently perspectival. That means that there can be a multiplicity of race concepts, each with limited warrant. To say that these are inherently perspectival (from the point of view of biological ontology) does not mean that there are no arguments that can be made in
favour of some concepts over others. This is indeed what some have done concerning a race concept (A. Appiah, L. Outlaw). They hold that certain ways of picking out race are better than others. Those ways of picking out race need to respond to the ways in which race develops, as a result of a number of reticulating cultural and psychological factors. These include the change in communication between individuals over time, the reflection of individuals and groups of individuals upon this expression, and the ways in which individuals inform their own and each other’s identification within a particular group. Instead of a limited fixed ontology of race, by which race is understood once and for all, I argue that the concept of race has to be understood in terms of a causally reflexive network of cultural, psychological, biological, and developmental processes.
Session: XIII.4 Room: Peter Chalk 1.1
Katie Kendig
University of Exeter, Exeter, Devon, United Kingdom
Session: VIII.6 Room: Peter Chalk 1.5 Time: Saturday 9:00-10:30
Elselijn Kingma
University of Cambridge, Cambridge, United Kingdom
Harmful Environments: A Problem For The Bio-Statistical Theory Of Health
Christopher Boorse (1977) purports to give a value-free theory of health. According to his Bio-statistical theory (BST), health is the absence of disease, and disease is an adverse departure from normal species functioning. On Boorse’s definition, a normal function of a part or process of an organism is a statistically typical contribution by it to the organism’s overall goals of survival and reproduction.
One of the problems faced by Boorse’s account is that the normal contributions of parts and processes to overall survival and reproduction vary enormously in response to specific environmental demands. Blood pressure, for example, is much higher during strenuous exercise than at rest. Boorse’s definition of normal function must therefore be amended: the normal function of a part or process is the statistically typical contribution of that process to survival and reproduction on a specific occasion or in a specific environment. A blood pressure above 140 is thus normal and healthy whilst doing exercise, but not normal and healthy whilst sleeping.
This amendment raises a further problem for Boorse, which is that in some environments (such as one filled with carbon monoxide) the statistically normal species reaction is a disease (namely asphyxiation). Boorse (1997) therefore amends his theory by introducing the notion of a harmful environment. In keeping with his claim that the BST is value-free, he proposes to analyse a harmful environment as an environment that is statistically abnormal. A condition resulting from such an environment is, on this amended account, a disease.
I argue that it is not possible for Boorse to give a statistical account of harmful environments, because being harmful does not equate to being a ‘rare’ or statistically abnormal environment. I then propose three other solutions that Boorse may wish to use to solve the problem of diseases that are normal responses to harmful environments, and show that none of them works. I conclude that it is not possible for the BST to give an account of disease caused by environmental agents. Since these cover not only trauma but also infections, they represent such a large part of the disease spectrum that this is an objection Boorse cannot ignore.
Time: Friday 14:30-16:00
and developed a new type of knowledge-making process based on the ideal of “curiosity”: a disinterested quest for total knowledge, not always driven by immediate profitability.
The final section of the paper explores how the collective exploration of the private was transplanted into colonial contexts. Liefhebbers used their “disinterested” quest for total knowledge of colonial natural resources to distinguish themselves from other planters, who were seeking only their private profit. The investment of their private time and resources in colonial agricultural projects underscored their imperial identity, which set them apart from both non-Europeans and other European planters.
The paper attempts to understand the early-modern idea and practice of the “private” in its historical context and to question a persisting tendency among historians to read more recent conceptual devices, such as the separate spheres and feminine domesticity, into this period. The paper also attempts to de-center and localize the social and cultural process of truth-making in the early modern period and to analyze how ideas such as “curiosity” were appropriated in various local contexts on both sides of the Atlantic.
Session: VI.10 Room: Peter Chalk 2.3
Tomomi Kinukawa
University of the Pacific, Stockton, CA, United States
Metamorphosis of the Private Time: Natural History as Entrepreneurship in Early Modern Dutch Atlantic
This paper examines a group of entrepreneurs in the Netherlands from the second half of the seventeenth century to the early eighteenth century, who established themselves as writers of nature through their business of inventing new models of nature study at burgher homes. In particular it focuses on a new genre of literature on insects’ metamorphoses that those entrepreneurs invented, and its use in art works, private gardens, collections of art and nature, as well as in a colonial agricultural enterprise.
This paper is an attempt to read those texts, illustrations, and other art works as a process in which those entrepreneur naturalists explored with utmost intensity the new possibility of the “private” as a resource for their entrepreneurship and a new type of knowledge production. The entrepreneurs gave both textual and visual expression to their intense experience of flowing private time as they gazed at insects’ metamorphoses in their private recess, and to their desire to control and possess the flow of private time through manipulating their private body and material resources at home. Responding to the expansion of the global market and trade, those entrepreneurs exploited their private time and body to secure the place of local production. Furthermore, the entrepreneurs used private time and body to form the new collectivity of liefhebbers, which transcended their occupational identities and professional boundaries. In their private recess, liefhebbers found a legitimate home for both material and knowledge production outside traditional institutions, such as guilds, universities, and courts,
Session: III.10 Room: Peter Chalk 2.3 Time: Thursday 16:30-18:00
Marianne Klemun
Institut für Geschichte, Universität Wien, Austria
Systematic botany in the romantic Vienna and “Voyages into the Flower Fields of Life”
Expressing feelings tenderly, mysteriously and indirectly means “saying something with flowers”. This powerful norm of bourgeois culture was invented around 1800 as “the language of flowers or the art of being able to convey any message with the help of flowers” and was also expressed in books of propriety for women. Their aim was to locate the signs perceivable with the senses, via cultural standardisation, in a communication about sense and to establish them as bourgeois knowledge. I will address the fact that, apart from aphorisms, which always contained a certain triviality, a scientific study in the field of botany, a “Flora of the Austrian Imperial State” published in Vienna in 1816, also showed this tendency.
Works on the flora usually contain a scientific enumeration of the plants growing in a certain area. The form and structure of their representation had been standardised since the 18th century. The “Flora” written by the custodian of the Imperial Cabinet linked knowledge in the field of natural history and the mobile elements of an economy of romantic sensitivity with
the signs of the plants as bearers of cultural codes, with the usual rigid grid-like order of flowers and plants being abolished. Are we dealing here with a trivialised romantic harmonisation of nature and culture or with a specific knowledge of women or with hidden criticism of the methods of science?
Session: VIII.9 Room: Peter Chalk 2.5 Time: Saturday 9:00-10:30
Session: XII.4 Room: Peter Chalk 1.1 Time: Sunday 9:00-11:30
Evolution Of Social Behaviour By Group Selection
It is usually thought that the evolution of social behaviour, just like any other behaviour, is guided by individual selection. Group selection explanations are thought to be needed only in the (rare or non-existent) cases of evolutionary altruism. The idea is that sometimes, although individual selection is always the primary force of evolution, group selection pulls evolution in another direction. Evolutionary altruism is where the workings of group selection can be seen. I argue that this picture is faulty. My main thesis is that social behaviour is always guided by group selection: the group level is the level of causal interactions that determine which behavioural dispositions will be selected. The conflict in evolutionary altruism is the only case where individual-level considerations become relevant, not the other way around: the decrease in an individual’s relative fitness is, of course, a constraint for the trait’s evolution, but in the cases of evolutionarily selfish social behaviour this constraint is simply missing. Evolutionary selfishness is not an explanatory factor as such. I will present my argument by discussing the modelling of reciprocal altruism with game theory and Elliott Sober’s and David Sloan Wilson’s take on that, then generalising my points. I agree with Sober and Wilson that reciprocal altruism (modelled as a tit-for-tat strategy) is a product of group selection, for the reasons they present, but I will argue that it is still an evolutionarily selfish trait. In order to do so I point out a confusion between two kinds (or levels) of altruism (evolutionary altruism and what I call “behavioural altruism”) and a confusion about what exactly counts as a trait here. I suggest that there are two possible (complementary and non-contradictory) ways to individuate a trait: the behavioural disposition of an individual and the interaction within the trait groups. The latter, a group-level trait, is what is selected for, and this selection is what is in fact modelled in group selection models. The individual-level behavioural disposition is not an independent trait for evolutionary considerations, and its evolution cannot be explained by individual selection, but the genetic makeup for that disposition is what gets selected. Sober and Wilson implicitly use a third way to individuate a trait in their discussion, but this, as I will show, is clearly a wrong way. The relation between social behaviour as a group-level trait and the individual behavioural dispositions underlying it is a complex one, and it makes the relations between altruistic behaviour, evolutionary altruism and
Gal Kober
Boston University, Boston, MA, United States
Biology without Species
Of the numerous species definitions found in the literature, not one has been suggested which could be applied to all groups of organisms that biologists might want to distinguish as species. These definitions fail in various ways: they are usually applicable only to a limited range of organisms, and species demarcated by different types of definitions most often do not converge. Moreover, the different definitions attribute the label ‘species’ to disparate taxonomic levels; nor are they applicable to a single, unique level – a conclusion reached by noting that what is found to be common to species categories also applies to higher taxa. This leads me to view the taxonomic category of species as ill-defined. I go further to claim that it is effectively unnecessary.
I claim that in fact not much is lost once the category of species is dispensed with. The main thing that species are thought to do is evolve, and so we may ask what it is that evolves if not species. I would like to suggest that losing the species category would not make a great difference to the theory of evolutionary biology, since, for the most part, the term ‘species’ is a mere placeholder for precisely this question: it is ‘that which evolves’, or sometimes just ‘the group of organisms at the very end of a lineage’. My alternative suggestion is based on the General Lineage concept suggested by de Queiroz. It is an attempt to replace the concept of species by focusing on the single component that seems to be shared by all approaches and definitions, however diverse: that species are lineages. While this is indeed a common and significant feature, it does not appear as a single kind of lineage, nor does it suffice to set species apart from other taxa. But while it does not save the species category, it certainly points in the right direction: a ‘species’ as a segment of a lineage in a cladistic framework, with the lineage now being considered as what evolves. I suggest abandoning the species category – the species rank of the classificatory tree – for a view of ‘species’ (or the concept formerly known as species) as a segment of the tree and a rank-free account of lineages and evolutionary development.
Tomi Kokkonen
University of Helsinki, Helsinki, Finland
group selection equally complex, but the causally relevant level is the group level. In conclusion I argue that what I’ve shown about reciprocal altruism holds for all social behaviour.
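As a point of reference (my own illustration, not drawn from the abstract), reciprocal altruism is standardly modelled with the repeated prisoner’s dilemma. With the usual payoff ordering T > R > P > S, a one-shot interaction has the payoff matrix

\[
\begin{array}{c|cc}
 & \text{Cooperate} & \text{Defect} \\ \hline
\text{Cooperate} & (R,\,R) & (S,\,T) \\
\text{Defect} & (T,\,S) & (P,\,P)
\end{array}
\]

and tit-for-tat cooperates on the first round and thereafter copies its partner’s previous move. In Sober and Wilson’s treatment, pairs of interacting players count as trait groups, which is what licenses reading the evolution of tit-for-tat as a product of group selection, as the abstract notes.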
Time: Saturday 11:00-12:30
Biology And The Other Sciences; Autonomy And Cohesion
Booming multidisciplinary life sciences challenge the autonomy of biology, the assessment of which should involve the autonomy and cohesion of all sciences.
In my view biology is as much (or as little) an autonomous science as its sister special sciences, such as chemistry and astronomy, over which the labor of studying the various levels of aggregated matter is divided. Implied is their joint dependence on general physics. One argument concerns the range of universality of laws. Scientific laws allow predictions of the behavior of the research objects of our interest. To be considered universal, a law of nature should be spatiotemporally unlimited, that is, hold over all times and places. Since life on earth as familiar to us is not necessarily unique, biological law-generalizations are not spatiotemporally limited.
That is not to say that biological laws are as universal as the fundamental laws of general physics. As I will argue, there is an overlooked notion of universality with respect to the level of matter configuration. The special sciences, such as astronomy, chemistry and biology, focus on one particular level of matter configuration; their laws contain mathematical variables that refer to (properties of) stars, molecules, and organisms respectively. Only the fundamental laws of general physics, such as the law of gravity, do not refer to matter at one particular configuration level but to (properties of) matter configurations in general. Biological laws do not match this range of universality. Does this imply that biology ought not to be considered an autonomous science after all? If so, it will share this fate with all special sciences, such as astronomy and chemistry.
To indicate the nature of the cohesion of the sciences, I will discuss the asymmetry of cross-level bottom-up and top-down explanations. Instantiations of (meta)stable matter configurations at a particular level (such as atoms, or molecules, or organisms) are explained partly by the causal interaction of their constituents from a less inclusive level (nucleons and electrons, or atoms, or molecules, respectively). Bottom-up explanations need to be complemented by top-down explanations invoking matter configurations from more inclusive levels; there
were no gold atoms before stars and no hemoglobin molecules before organisms. Bottom-up explanations refer to the charge-based causal interaction of the constituents of a less inclusive level. Top-down explanations, however, refer to the local conditions as offered within the configurations of a more inclusive level. Instantiations at any level may switch roles.
Session: IX.9 Room: Peter Chalk 2.5
Diedel Kornet
Leiden University, Leiden, Netherlands
Session: VI.8 Room: Peter Chalk 2.6 Time: Friday 14:30-16:00
Alexei Kouprianov
European University at St. Petersburg, St. Petersburg, Russian Federation
“If we only could combine Tournefort’s drawings with Rivinus’s definitions”: the positive program by Johann Georg Siegesbeck (1686-1755) for systematic botany
Johann Georg Siegesbeck (1686-1755), a German-born physician and naturalist, demonstrator of the Physical botanical garden and later a professor at the Academy of Arts and Sciences in St Petersburg, is notorious for his rejection of Carl Linnaeus’s sexual system of plant classification on moral and theological grounds. His critical essays were opposed in pamphlets signed by the German botanist Johann Gottlieb Gleditsch (1714-1786) and the Swedish theologian Johan Browallius (1707-1755).
Even though this controversy, full of spectacular details including Linnaeus’s infamous joke with Siegesbeckia orientalis seeds labelled Cucullus ingratus (ungrateful cuckoo), has attracted considerable attention from scholars, there is still room for further research. A closer examination of Siegesbeck’s works shows that his criticism was not based solely on moral or theological grounds but contained some botanical criticisms as well, which have been discussed less by historians, seemingly because they are too technical. The present paper aims at a more symmetrical account of the controversy. It focuses on Siegesbeck’s positive program for systematic botany, set against the background of contemporary taxonomic practices, and on Siegesbeck’s relationships with his colleagues in St. Petersburg, which eventually led to his retirement in 1747 and the takeover of the Academy by botanists supporting Linnaeus’s reforms.
Session: V.11 Room: Peter Chalk 2.1 Time: Friday 11:30-13:00
Session: VIII.1 Room: Newman B Time: Saturday 9:00-10:30
Ulrich Krohs
Maria Kronfeldner
University of Hamburg, Hamburg, Germany
Max Planck Institute for the History of Science, Berlin, Germany
How Systems Biology Makes Sense of (Gen)omics
According to Dobzhansky, it is the light of evolution which is required for making sense of biological findings. The light of evolution fails, however, to make sense of the data collected in genomics, at least in the important respect of understanding the causal role of particular genes in developmental and physiological processes. Only the view of systems biology allows us to make sense of genomics (and of the results of other “omic” disciplines). I shall discuss why this is so by presenting an argument that is based on two theses, both of which look fairly innocent. In addition, reference is made to an important empirical result and to a basic evolutionary assumption.
The first thesis I argue for interprets the “making sense” part of the claim: making sense of scientific findings requires that an explanatory model is built that accounts for the data. The second thesis is that a model in biology needs to address the causal roles of interacting components to count as explanatory. The empirical finding that enters the argument is the following: on the macromolecular level, biological systems show distributed functionality (where “functionality” refers, à la Cummins, to causal roles and not to etiology). This means that a particular causal role cannot usually be ascribed to particular macromolecular components but is realized by a larger network as a whole. The contributions of different paths and components to the behavior of the network are somewhat flexible. Consequently, the causal roles of a particular component depend not only on its properties, but on the system in which it is embedded.
It follows that knowledge of a set of genes is not a sufficient basis for modeling the causal roles the genes and their products fulfill in a cell, not even when the products and their primary effects are known. Knowledge of the interactions within the network is indispensable for such an explanation. These interactions cannot be read off the sequence of the genome; they have to be investigated by the methods of systems biology. Therefore, only systems biology allows us to make sense of genomic data. This is even true for phylogenetic explanations of the structure of a genome, as can be seen when the biological assumption that not genotypes but phenotypes are selected in evolution is added: not even changed causal roles of modified components make the difference for selection; only the resulting modifications of organismic traits do. So nothing in genomics, not even phylogenetic findings, makes sense except in the light of a systems perspective.
In the Name of Culture: The History and Importance of Cultural Inheritance
When, in turn-of-the-20th-century America, Alfred L. Kroeber continued the work of Franz Boas in establishing anthropology as an academic discipline, he defined culture as a phenomenon sui generis. The goal was to build a strong opposition to the hereditarian thought prevalent at that time. He thereby treated culture in close connection with the concept of heredity: culture as opposed to biological heredity (culture as superorganic) and, at the same time, culture as heredity of another sort.
In my paper, I will present Kroeber’s position and analyze which concepts of inheritance influenced him. Weismann and Mendelism will be mentioned, the concepts of hard versus soft inheritance will be introduced, and the significance of Lamarckian inheritance of acquired characteristics for the nature- nurture-debate and racism of that time will be discussed. This will lead to some general conclusions on how to describe the theoretical role of the concept of cultural inheritance. In addition, I will show that evolutionary psychology, by assuming that culture is a mere triggering condition for innately specified modules of the mind, falls back to a 19th century concept of culture, ignoring more than a century of debates on the concept of culture and the importance of cultural inheritance. Last but not least, the short history of the concept of culture presented in the paper will allow me to show that the rise of genetics at the beginning of the 20th century did not have an unambiguous unidirectional historical effect on the vogue of hereditarian thoughts in the US at that time. Although it is certainly true that the rise of genetics furthered hereditarianism, the development also helped cultural anthropologists to formulate their opposition to then prevalent hereditarian thoughts and thereby to establish culture as a phenomenon sui generis – a phenomenon that strongly relies on cultural inheritance, that needs its own academic approach for treatment, namely cultural anthropology, and that is not reducible to nature, innate characteristics, biological evolution etc.
Session: VI.11 Room: Peter Chalk 2.1 Time: Friday 14:30-16:00
plurality of species concepts produces an effect structurally similar to moving between more or less inclusive taxonomic groups. This effect is reflected in the few empirical studies that directly address this problem. These results ought not be surprising. For the appeal to the species category to solve the problem of taxonomic resolution, some agreement on a species concept is required to avoid the problems I have identified here. This, empirically, is not likely to be achieved in the near future and may not even be desirable or biologically warranted. Further, assuming that a single species concept were possible, there is no guarantee that the groups of organisms picked out by systematists would be the best way to group organisms relative to any particular ecological process. However, while different species concepts do lead to a plurality of models, I do not see this as a deficit of systematics or ecology, but rather as a useful tool for producing better science. In the end, I argue that the best strategy for solving the problem is to embrace it. Using different species concepts (or even supra-specific taxonomic categories) to produce multiple models can be a strength of ecology if the relationships between systematics, organismal biology, and the particular ecological processes studied are made explicit.
Ulrich Kutschera
Institute of Biology, University of Kassel, Kassel, Germany
Endosymbiosis and cell evolution: the history of an idea
In 1905, the Russian biologist C. Mereschkowsky (1855–1921) postulated that plastids (e.g., chloroplasts) are the evolutionary descendants of endosymbiotic, cyanobacteria-like organisms. In 1927, I. Wallin explicitly postulated that mitochondria likewise evolved from once free-living bacteria. Here, I trace the history of these endosymbiotic concepts to their modern-day derivative, the “serial endosymbiosis theory”, which expounds the origin of eukaryotic cell organelles (plastids, mitochondria) and subsequent endosymbiotic events. In addition, I review recent hypotheses about the origin of the nucleus. Model systems for the study of “endosymbiosis in action” are also described, and the hypothesis that symbiogenesis may contribute to the generation of new species is critically assessed, with special reference to the secondary and tertiary endosymbiosis (macroevolution) of unicellular eukaryotic algae.
Session: XIII.8 Room: Peter Chalk 2.6 Time: Sunday 11:00-12:30
Tel Aviv University, Tel Aviv, Israel
Toben Lafrancois1,2
Taxonomic resolution in ecology: How species concepts produce a plurality of ecological models.
In almost all ecological analyses, individual organisms are sorted into taxonomic groups that constitute the ‘parts’ of the resulting model or analysis. Practical constraints and other considerations that limit choices of taxonomic groups result in what ecologists call ‘the problem of taxonomic resolution.’ This problem is noteworthy since using more or less inclusive taxonomic classes has a variety of effects on the outcomes of those models. Different taxonomic groupings result in a different number of objects in the system, and in objects with different members, and both changes can produce different results or models. These effects raise questions about which taxonomic level of grouping organisms produces the best model. For a variety of reasons, one appealing solution is species-level analysis. I argue that both conceptual and empirical issues foil this solution. In particular, the plurality of species concepts produces an effect structurally similar to moving between more or less inclusive taxonomic groups. This effect is reflected in the few empirical studies that directly address this problem. These results ought not to be surprising. For the appeal to the species category to solve the problem of taxonomic resolution, some agreement on a species concept is required to avoid the problems I have identified here. Empirically, this is not likely to be achieved in the near future and may not even be desirable or biologically warranted. Further, even if a single species concept were possible, there is no guarantee that the groups of organisms picked out by systematists would be the best way to group organisms relative to any particular ecological process. However, while different species concepts do lead to a plurality of models, I do not see this as a deficit of systematics or ecology, but rather as a useful tool for producing better science. In the end, I argue that the best strategy for solving the problem is to embrace it. Using different species concepts (or even supra-specific taxonomic categories) to produce multiple models can be a strength of ecology if the relationships between systematics, organismal biology, and the particular ecological processes studied are made explicit.
Evolution of Networks and Networks of Evolution
Biological systems exhibit many types of behavior and structure that lend themselves to the network perspective: internal networks such as neural nets, gene regulatory nets, protein interaction nets and signaling pathways, and external networks such as ecological relationships and epidemiological (infection) networks. In recent years the theory of networks has become a well-defined field, rooted in mathematics and statistical physics. Partly inspired by these developments, the study of biological networks has become a prominent area of ongoing research. We explore how research on gene regulatory networks and protein interaction networks is related to arguments regarding neutralism, gene shortage, gene additivity and canalization, and how the network perspective differs from the gene-centered view of evolution. We explore how the network perspective can be integrated into a wider post-Synthesis view of evolutionary processes. We concentrate on how networks can be used to explain evolutionarily important properties such as phenotypic modularity and plasticity. We argue that networks and subnetworks should be viewed as developmental units and that research should focus on how network structures change ontogenetically as well as phylogenetically. We discuss how theoretical models of networks can be used to explain and guide empirical observation, how subnetworks should be studied, and how internal networks should be analyzed given the context of the evolutionary-ecological networks in which organisms live.
Session: IV.4 Room: Peter Chalk 1.1 Time: Friday 9:30-11:00
Ehud Lamm
Tel Aviv University, Tel Aviv, Israel
Session: XI.8 Room: Peter Chalk 2.6 Time: Saturday 16:00-17:30
Session: VI.7 Room: Peter Chalk 1.6 Time: Friday 14:30-16:00
Otto Lappi1, Anna-Mari Rusanen2
Brendon Larson
1) Cognitive Science Unit, Department of Psychology, University of Helsinki, Finland, 2) Department of Philosophy, University of Helsinki, Finland
University of Waterloo, Waterloo, Ontario, Canada
Marr’s Computational Level and Mechanistic Explanation – Extending the Notion of Mechanism
It has been suggested by many philosophers of the bio- and neurosciences that the mechanistic model of explanation offers a satisfactory model of explanation for the neurosciences. It has also been proposed that the mechanistic model could be extended to cover computational explanations in computational neuroscience as well.
Computational neuroscience attempts to relate computational theories of cognitive – i.e., information-processing – mechanisms to the known physiological mechanisms and biophysics of the brain. Thus, models in computational neuroscience aim to integrate different levels of organization. Marr’s levels of computation, algorithm and implementation offer one account of the organization of such multilevel theories.
In our presentation we discuss various definitions of the notion of mechanism in the literature on mechanistic explanation in the bio- and biomedical sciences, and argue that computational explanation in the Marrian sense (top-down explanation from the computational level to the algorithmic) employs a subtly different notion of mechanism from that operating in mechanistic explanation in the biosciences.
Towards An Ethics of Biological Metaphor: The Case Of Promotional Metaphors
Metaphors play a crucial role in guiding biological inquiry, sometimes even constituting it. Since biological metaphors derive from everyday sources, they also allow scientists to describe their ideas and findings to others – including colleagues, scientists in other disciplines, and non-scientists. This communicative capacity weds the epistemological and rhetorical functions of metaphor: if scientists cannot explain their ideas so that others can understand them, then those ideas will not be embraced (by their colleagues, funding agencies, or the public). Therefore, metaphors will be chosen for their emotive and rhetorical benefits as well as their cognitive and epistemic ones. And since scientists and non-scientists live in a similar cultural context, well-chosen metaphors tend to appeal to both audiences.
In this presentation, I introduce the concept of “global metaphor” to characterize large-scale constitutive metaphors of this sort, which both constitute an area of biological inquiry and resonate more widely in society. As a case study, I will extend Dorothy Nelkin’s work on “promotional metaphors” by presenting the results of interviews with influential biologists who have recently introduced metaphors, including “meltdown” and “barcoding,” which have successfully promoted their research in both scientific and popular circles. A key question I wish to explore is whether we can extend Philip Kitcher’s “responsible biology” to the process whereby scientists introduce and adopt particular metaphors.
We then propose extending the notion of mechanism to cover computational explanation in Marr’s sense by recognizing what we call abstract mechanisms as bona fide mechanisms. We argue that this notion of explanation by means of abstract mechanisms is the best philosophical account of how the (Marrian) computational approach integrates cognitive and (neuro)biological explanation into a single unified neurocognitive explanatory framework.
In this way, computational and cognitive neurosciences – as well as physiological, biomedical and systems neuroscience – could all be covered under the mechanistic model of explanation, albeit with an extended notion of mechanism; certain unique forms of explanation used in computational neuroscience would then count as mechanistic interlevel explanations specific to neurocognitive explanation and not heretofore discussed in the literature on mechanistic explanation in the neurosciences.
Session: XI.1 Room: Newman B Time: Saturday 16:00-17:30
Manfred Laubichler
Arizona State University, Tempe, AZ, United States
Regulatory Gene Networks: Historical and Epistemological Reflections
The Evo-devo synthesis of the last decades has focused to no small degree on causal explanations of phenotypic characters, their variations, and their evolutionary transformations. Research into the molecular details of developing systems has uncovered regulatory networks that govern patterns of gene expression. These regulatory gene networks are believed to be responsible for differentiation during development as well as, through changes in their structure, for evolutionary transformations. Furthermore, the specific architecture of these regulatory gene networks seems to reflect the observed patterns of comparative biology, with different elements of regulatory gene networks showing variable rates of change, thus accounting for the intrinsic conservation of Bauplan features.
In this paper I will trace the history of the discovery of regulatory gene networks and analyze the explanatory role these networks play in current theories of developmental and evolutionary developmental biology. I will then explore how the concept of regulatory gene networks can mediate the conflict between genetic and developmental viewpoints, a challenge that Dick Burian has put forward to all of us (Burian 2005).
Session: VI.4 Room: Peter Chalk 1.1 Time: Friday 14:30-16:00
Benjamin Lazier
Reed College, Portland, OR, United States
Environmental information in a Greek forest reserve: Scientific rhetoric and images of nature
In the present paper, we account for the discursive practice of environmental information through a study of the textual material (both verbal and pictorial) displayed in the information center of a Greek reserve (Dadia forest). Information centers attempt to satisfy the ‘green’ dimension of ecotourism by raising the environmental awareness of visitors, and therefore they constitute a form of informal environmental education (Negra & Manning 1997). At the same time, in the context of tourism studies, information centers participate in the shaping of visitors’ ‘destination images’, that is, the totality of the perceptions and feelings they form about protected areas (e.g. Fakeye & Crompton 1991, Bigné et al. 2001).
Through content analysis, we examined both ‘cognitive’ (environmental content) and ‘companion meanings’ (metaphors and images of the natural world accompanying environmental content) (Östman 1996, Roberts 1998). Focusing also on the linguistic and pictorial forms of the texts, we analyzed the kind of rhetoric (‘scientific’ and/or ‘humanities’) employed by environmentalist discourse (Veel 1998).
This multi-layered textual analysis reveals that the environmental information diffused through the reserve builds a ‘romantic’ view of nature (e.g. Kwa 2002, Law 2004). Specifically, processes of the natural world were represented as self-contained events, as something ‘out there’ that scientists merely observe and record. Thus, science was depicted as having the role of disclosing eternal natural laws. In this context, human intervention was understood as a ‘disturbing’ stimulus to natural balance, forcing nature to respond in a certain way, rather than as an acting force that affects nature in a direct way.
“Pythagoras II” funds this research.
Aristotle, Again: Jürgen Habermas, Leon Kass and the Ethical Self-Understanding of the Species
European thought of the twentieth century has taken as one of its prevailing themes the modern dominance of the artifactual over the natural. My paper explores one reaction to this reign: the revival of teleology, the idea that natural organisms are endowed with will, autonomy and purpose. The concept has origins in Greek cosmology, but was superseded by the rise of an anti-Aristotelian, mechanistic science in early-modern Europe. In recent years, however, it has returned with a vengeance: in discussions regarding organic agriculture and intelligent design, for example, but also in more elemental arguments about the organism as a repository of autonomy. These arguments question the equation of human freedom with the fabrication of our world and ourselves, and so overturn the dominant approach to the issue since the age of Hobbes, Bacon and Descartes. Literally and metaphorically, these arguments discover in the organism a freedom alien to modern sensibilities; their revival demands serious attention.
My paper addresses one dimension of this revival: the striking rapprochement between neo-conservative and neo-marxist critiques of eugenics, between figures like Leon Kass (former head of the President’s Council on Bioethics) and the German social theorist Jürgen Habermas. Both critiques invoke our property interest in bodily integrity to contest the trade of organs and genetic material in a globalized liberal market; both bemoan the displacement of political decisions by economic calculation as the precondition for a liberal eugenics in the first place; and above all, both invoke the autonomy of the biological organism as the foundation for the ethical self-understanding of the (human) species. This paper has three aims: first, to demonstrate how anxieties about a neo-liberal displacement of the grown by the made, of organism by artifact, have fashioned unlikely bedfellows; second, to show how both camps invoke the autonomy of the organism as such to contest technologies of life that call the autonomy of the human organism into doubt; and third, to show how both, in doing so, in effect revive Aristotelian views on the nature of the animal soul.
Ageliki Lefkaditou1, Anastasia G. Stamou2, Dimitrios Schizas1,
George P. Stamou1
Session: IV.9 Room: Peter Chalk 2.5 Time: Friday 9:30-11:00
1) Aristotle University of Thessaloniki, Thessaloniki, Greece, 2) University of Thessalia, Thessalia, Greece
Session: X.3 Room: Newman D Time: Saturday 14:00-15:30
The Child That is Wanted: Kinship and the Body of Evidence
I consider in this paper how the ‘own’ child that underpins and justifies reproductive technologies is defined. The idea of the ‘biological’ child is ubiquitous in writings and discussions around reproductive technologies, and a number of anthropologists working in this area have pointed out that the ‘biological’ and the ‘own’ are understood in a range of ways by those involved in reproductive technologies (see, for instance, the seminal work of Marilyn Strathern, Sarah Franklin, Helena Ragoné and Charis Cussins Thompson). However, the work of these anthropologists does not consider how precisely the child that reproductive technologies are supposed to produce is defined, and that is therefore the focus of my paper.
Two areas that are key to ideas of the biological and the own are genetics and anthropological kinship studies. The anthropologist Helena Ragoné writes: ‘[artificial reproductive technologies] have served to defamiliarize what was once understood to be the “natural” basis of human procreation and relatedness... as the Comaroffs so eloquently said of ethnography, “to make the familiar strange and the strange familiar, all the better to understand them both”’. I mention this quotation to indicate the ways in which reproductive technologies are seen to introduce new perspectives on ‘human procreation and relatedness’. It is one clear example of the way the idea of ‘family’ is seen to be changed by reproductive technologies, and yet the idea of ‘family’ can be argued always to have shifted and changed in history and between societies. I consider in detail the arguments of the anthropologist Janet Carsten (After Kinship, CUP, 2004) about how reproductive technologies have changed ideas of kinship in some senses and preserved them in others. I further bring Carsten’s discussion into relation with the writings on kinship of the theorist Judith Butler, as well as linking it to the writing of Janelle Taylor et al. in their book Consuming Motherhood (Rutgers UP, 2004).
Session: II.4 Room: Peter Chalk 1.1 Time: Thursday 14:30-16:00
Sabina Leonelli
London School of Economics, London, United Kingdom
Bio-Ontologies: a New Means of Travel for Biological Facts
Bio-ontologies are often presented as a ‘neutral’ tool for the diffusion of facts about organisms to biologists: that is, as a way to standardise the terminology and the relations among terms used to describe biological processes, so that the immense amount of (especially microbiological) data recently accumulated on various aspects of the main model organisms can be brought together and made accessible to the whole biological community. In this paper, I argue that bio-ontologies are not a neutral vehicle for the diffusion of evidence. Rather, they constitute a new type of biological theory, incorporating a specific perspective on biological phenomena, through which data are re-interpreted in order to fit specific research goals. Notably, one of these goals consists of integrating the available knowledge about various aspects of an organism into an overall understanding of its biology. The main issues that I shall address in this paper are thus the following: how well do biological facts circulate through bio-ontologies? How effective is the use of bio-ontologies towards obtaining integration in biology? And what kind of integration is that – is it actually possible to distinguish it from a kind of theoretical unification? In addressing these questions, I focus on the use of one of these bio-ontologies, the so-called Gene Ontology, to structure and display data about Arabidopsis thaliana within The Arabidopsis Information Resource.
Karin Lesnik-Oberstein
University of Reading, Reading, United Kingdom
Session: III.7 Room: Peter Chalk 1.6 Time: Thursday 16:30-18:00
Session: VIII.11 Room: Peter Chalk 2.1 Time: Saturday 9:00-10:30
Bert Leuridan
Ghent University, Ghent, Belgium
Can Mechanisms Replace Laws of Nature?
The concept of ‘mechanism’ (interpreted as a complex system, cf. [1]) has gained much popularity in the philosophy of the life sciences. A large part of the mechanistic literature is motivated by the finding that genuine laws of nature (satisfying the traditional criteria for lawfulness) are seldom found in the life sciences. Mechanisms and mechanistic explanation are put forward to replace laws of nature and nomological explanation.
Over the past fifteen years, however, the traditional criteria for lawfulness have also been attacked from another side. Sandra Mitchell retains laws of nature, but proposes to interpret them pragmatically, in a multi-dimensional, non-dichotomic framework (cf. [2], [3]).
It remains an open question what the precise relation is between these two concepts, ‘complex-systems mechanism’ on the one hand and ‘pragmatic law’ (P-law, for short) on the other. I will argue that mechanisms cannot replace laws of nature. My line of reasoning will be both conceptual and methodological.
The concept of mechanism presupposes the existence of P-laws (for a system to count as a mechanism, both its overall behavior and that of its parts must to a certain extent be P-lawful). The concept of P-law, by contrast, does not presuppose the existence of underlying mechanisms (nothing precludes a P-law from being fundamental, either in a relative or in an absolute sense). So P-laws are conceptually prior to mechanisms. Furthermore, although mechanistic evidence may greatly enhance our confidence in some P-law, other means may do so as well: experiments, randomized experiments, prospective (and retrospective) studies. What is more, these means often carry more epistemic weight than mechanistic evidence in the life sciences, as is evident from the IARC’s evaluation procedure for carcinogenicity (cf. [4]).
REFERENCES
[1] Machamer et al. (2000). “Thinking about mechanisms”. Philos. Sci., 67: 1-25.
[2] Mitchell (1997). “Pragmatic laws”. Philos. Sci., 64 (proc): S468-S479.
[3] Mitchell (2000). “Dimensions of scientific law”. Philos. Sci., 67(4): 242-265.
[4] IARC (2006). http://monographs.iarc.fr/ENG/Preamble/CurrentPreamble.pdf
George Levit
Friedrich-Schiller-Universität, Jena, Germany
Global Microbiology: One More Step to a “New Synthesis”?
Although rudiments of the biosphere concept can already be found in Plato’s Timaeus, and in modern times in James Hutton’s (1726-1797) ‘superorganism’ or Lorenz Oken’s (1779-1851) Weltorganismus, the first scientist to use the term ‘biosphere’ in the modern sense was the Austrian geologist Eduard Suess (1831-1914). The first scientific theory of the biosphere was systematically developed by the Russian Vladimir Vernadsky (1863-1945), who is at the same time regarded as the founder of biogeochemistry (the very term biogeochemistry was coined by Vernadsky). Comparable global approaches were also developed by the French palaeontologist and Jesuit Father P. Teilhard de Chardin (1881-1955) and the Russian morphologist Vladimir Beklemishev (1890-1962). At present, apart from classical biogeochemical studies, various versions of the biosphere theory are supported by the Russian microbiologist Georgi A. Zavarzin, the English inventor James Lovelock, the German geomicrobiologist Wolfgang E. Krumbein and the American microbiologist Lynn Margulis.
The idea in its most general form interprets the Biosphere (the total sum of living organisms together with their environment) as a dynamic, self-regulating system evolving in accordance with its own laws. Vernadsky already pointed out that all biogeochemical functions can be carried out by unicellular organisms. The prokaryotes play the fundamental role in the running of the biogeochemical cycles of the biosphere. Subsequent evolution had an additive character, and complementary organisms were incorporated into the system in balance with the lower-level phenomena. This was already fully realized by some “co-architects” of the Synthesis, as can be exemplified by the later (and almost unknown) works of I. I. Schmalhausen (1884-1963), who became influenced by the ideas of Vernadsky. Subsequent formulations of the biosphere theory suggest an approach diametrically opposed to that of the classical Evolutionary Synthesis. Instead of bottom-up causation (DNA changes → proteins → phenotypes → natural selection in populations → species → ecosystems), we have a situation in which the biosphere causes and controls events at the structurally lower levels. Can this global, microbially based approach, implying downward causation, co-exist peacefully with the Synthesis? Are the external constraints canalizing population-level processes a real threat to the monopoly of natural selection as the ‘only direction-giving factor’ (Mayr)? Do we need another “Synthesis”?
Session: X.4 Room: Peter Chalk 1.1 Time: Saturday 14:00-15:30
Arnon Levy
Harvard University, Somerville, United States
Biological Information as an Explanatory Metaphor
Much of the philosophical discussion surrounding the notion of information in biology has been concerned with specifying criteria of application – truth conditions – for informational terms, especially to processes at the cellular level. Underlying the debate has been an assumption that if such criteria cannot be coherently articulated, then ‘information’ and its cognates cannot play an epistemic or explanatory role in cellular and developmental biology.
Here I wish to challenge this assumption and suggest that we may understand the cognitive role of information in biology as akin to that of a metaphor. While not being a literal description of biological reality, informational metaphors might still serve a substantive cognitive role in allowing biologists to reason about and communicate important explanatory facts. I appeal to Kendall Walton’s well-known notion of ‘prop-oriented make-believe’ to articulate this view. We often employ objects, props as it were, in connection with games of make-believe. Walton contrasts two aims we might have in such an activity. Sometimes we employ the prop in the service of the make-believe, as when a child treats a doll as her imaginary friend. But sometimes it is the other way around: we make-believe in order to think and communicate about the prop, as when we explore the geography of Italy in terms of where various locations are on the “Italian boot”. Invocations of information in biology, I wish to suggest, work in much the same way: they call to mind a fiction in which some elements of cellular (or developmental) processes are treated as if they were information sources, some as signals and others as receivers. This fiction provides a compact structure within which one can reason and communicate about core causal features of the processes in question – primarily, it isolates their directionality and the manner in which they function to correlate variation in one system with variation in another system. In several important respects, the point of prop-oriented make-believe is akin to that of models, construed as abstract constructions which represent real-world systems indirectly (this understanding of models is closely related to the views of Ron Giere, Peter Godfrey-Smith and Michael Weisberg). Such a conception allows us to affirm the cognitive role of informational notions while denying that information is ‘real’. It thus allows one to heed the lessons of philosophical criticisms of the literal applicability of the notion of information – especially the parity thesis (we need not, indeed we ought not, accept that there are informational ‘things’). At the same time, we can accept the attitude, widespread among biologists, that informational notions are a highly useful tool in understanding biological phenomena.
Room: Peter Chalk 2.3
Forces and Causes, Probabilities and Populations: Clarifying the Metaphysics of Selection
Early articles defending the ‘statistical’ interpretation of natural selection against the ‘dynamical’ one have not always been careful enough to untangle a range of quite different issues. The main questions that need to be distinguished include:
i) Can natural selection and drift be said to act on individual organisms or on populations?
ii) Can natural selection and drift be compared in their influence in a way that is analogous to the comparison of forces?
iii) Are natural selection explanations causal?
iv) Do natural selection and drift explanations invoke objective probabilities that lie between zero and one?
v) Are these probabilities properties of individuals or of populations?
My main goal in this paper is to show that these questions are largely independent of each other, and in this respect I am following the lead of Roberta Millstein, Chris Stephens and Elliott Sober. But I will also offer some brief suggestions regarding the answers to the first four questions.
Session: IV.10 Time: Friday 9:30-11:00
Tim Lewens
University of Cambridge, Cambridge, United Kingdom
Session: V.3 Room: Newman D Time: Friday 11:30-13:00
Gesa Lindemann
TU Berlin, Berlin, Germany
Neuronal expressivity: A new technology of innocence
In this talk I shall unfold the argument that neuroscience is less a science of the single brain than a new technology for understanding the other. The neurobiological sciences measure and visualize the activities of the brain; the techniques include EEG, fMRI and invasive electrophysiology, just to name a few. The practices of visualizing and interpreting neuronal activities are based on a peculiar working concept, which can be described as neuronal innocence: neuronal activity is considered to be spontaneous and uncontrollable by the organism. Based on this assumption, experimental neuroscience offers a technology for understanding the other as she/he truly is. The operating principles of this understanding will be described in a first step, based on ethnographic observations of neuroscientific research practices.
Session: VIII.10 Room: Peter Chalk 2.3 Time: Saturday 9:00-10:30
In a second step the possible social consequences of neuronal innocence will be discussed. Neuroscientists offer their technology as a pathway to what the (human) organism really feels and wants, i.e., a technology of authenticity and truth-telling. The more such measuring devices are miniaturized, the more they will invade everyday life. The Christian God could look into hearts; with a little help from neuroscience we will be able to look into each other’s brains. Perhaps we are close to the paradise of social relationships unmarred by lying and falseness. It is not hard to imagine where such paradise technologies would be adopted: in court, or in situations of emotional confession, between lovers for example. Then it will become obvious what is actually hidden in the laboratory: neuroscience is not concerned with single brains, but with producing new technical expressive surfaces (the visualization of actual brain activity) and with producing peculiar new modes of understanding the technical expressivity of the other.
Bias in Evolutionary Explanations of Women’s Orgasm
I shall present evidence for two sorts of bias in evolutionary explanations of female orgasm, adaptationist and androcentric. Adaptationist bias arises when researchers assume that female orgasm arose because it contributes to the reproductive success of women, without evidence supporting that assumption. Upon careful examination, none of the adaptive explanations have the kinds of supporting evidence required under normal evolutionary standards. Androcentric bias arises in this case when female orgasm is assumed to occur with intercourse, as male orgasm does. Briefly, female sexuality is pictured as identical to male sexuality. But this conflicts with our best sex research. I shall review the harm done to evolutionary explanations by both biases.
Elisabeth Lloyd
Indiana University, Bloomington, IN, United States
Session: VI.1 Room: Newman B Time: Friday 14:30-16:00
Session: V.8 Room: Peter Chalk 2.6 Time: Friday 11:30-13:00
From the “DNA is a Program”, a Misleading Model and Metaphor in Molecular Biology, Toward the Role of Randomness and Extended Criticality of Living Entities
Since the 1930s and, more specifically, the 1950s, molecular processes in cellular reproduction have often been analyzed through the “DNA is a program” metaphor, or even model, in the physico-mathematical sense. Schroedinger contributed to this understanding with his 1944 book, but he lucidly hinted at the intended causal structure underlying the paradigm. Subsequent work by Monod, Lwoff and Jacob opened the way to a wider use of this model or metaphor, as well as to a Laplacian understanding of biological causality and randomness. After a short survey of the structures of determination, from Laplace to modern dynamics, we argue that both the metaphor and the model are causally inadequate, in particular when derived from the current empirical practices in Molecular Biology, based on the “differential method” (a mutation is observed or induced and its phenotypic consequences are observed). Referring to empirical evidence and theoretical writings in Biology, we will argue that the programming/deductive paradigm is theoretically unsound as a causal and deductive frame for relating the genome to the phenotype, even from the point of view of Physics and Programming. This is in contrast to the physicalist and computational grounds that this paradigm claims to provide.
Stefan Linquist
University of Queensland, Brisbane, Queensland, Australia
But is it progress? On the alleged advances of conservation biology over ecology
A growing consensus in the philosophy of ecology recognizes conservation biology as a discipline distinct from ecology. In addition, it is argued that the respects in which conservation biology has diverged from ecology mark important advances towards its aim of preserving biodiversity. Here I critique three arguments in favor of this thesis: (1) since ecology has failed to identify universal principles or “laws”, conservationists are epistemically justified in adopting commonsense heuristics in their place; (2) by incorporating elements from the social sciences, conservation biology has made important advances over ecology; (3) conservation biology has developed a distinctive place-based ontology, focused on token entities, which is superior to the type-based ontology offered by ecology. As I will show, all three proposals are problematic.
Giuseppe Longo
Ecole Normale Superieure, Paris, France
Two different notions of randomness as unpredictability, in Classical and Quantum Physics, will be briefly surveyed. We will hint at the need for a third concept of randomness, which the unpredictability of the ecosystem seems to require in Biology. Finally, the familiar physical notion of “self-organized criticality” will be “extended” to an understanding of the long-lasting coherence structures of life, as “extended criticality”, a conceptual frame for the analysis of the physical singularity of life.
(This is a collaborative work with P.-E. Tendero and F. Bailly)
REFERENCES (see: http://www.di.ens.fr/users/longo )
Bailly F., Longo G. Mathématiques et sciences de la nature. La singularité physique du vivant. Hermann, Paris, 2006 (Introduction in English downloadable)
Session: XIII.8 Room: Peter Chalk 2.6 Time: Sunday 11:00-12:30
Françoise Longy1,2
1) Institut d’Histoire et de Philosophie des Sciences, Paris, France, 2) Université Marc Bloch, Strasbourg, France
Function as an Overarching Concept
The etiological theory of function proposed by Larry Wright in 1973 had some flaws. The subsequent versions of the theory that Millikan, Neander and others proposed in the 80s corrected these flaws by rooting the notion of function in biology and by introducing natural selection explicitly into its definition. A scientifically acceptable concept was thus obtained, but its domain of application was quite limited – it could be applied literally only in biology – and its scientific interest was almost nil: it was synonymous or almost synonymous with “being an adaptation” (depending on which version of the theory you considered). First, I will argue that we need a more abstract notion of function in order (1) to account for the continuity between biological and artefactual functions and the half-artefactual, half-natural functions of bio-artefacts; and (2) to justify the explanatory role functions have in biology and in other sciences. Second, I will set out a list of desiderata that the new notion should satisfy and indicate how these could be fulfilled by introducing probability and real kinds (a natural kind is a particular sort of real kind) into an etiological definition of function.
Session: IX.3 Room: Newman D Time: Saturday 11:00-12:30
Clement Loo
University of Cincinnati, Cincinnati, United States
Session: II.11 Room: Peter Chalk 2.1 Time: Thursday 14:30-16:00
What is Natural?
Global warming, habitat protection, and invasive species are at the moment hotly debated topics in the public arena. These debates revolve around various understandings of what is “natural” and differing views regarding what we should do to bring such a state about. Such being so, it seems that we should tease out the concepts underlying the notion that there is some state that the world ought to be in and that we should endeavor to preserve. Review of the work of writers such as J. Baird Callicott, Holmes Rolston III, Thomas Heyd, and Kate Soper also suggests that our notion of what is natural requires further consideration. While all of these writers have presented views regarding how nature and wilderness should be understood, there is little consensus among them. Given that there is little agreement to be found in the current literature regarding what is natural, I will in my paper attempt to propose some means of adjudicating between the various positions. In doing so I will argue that the question of what is natural, rather than being treated as an ontological question aimed at identifying some particular privileged state of nature, should instead be regarded pragmatically, in terms of various human goals, goals that I will spell out by drawing on the work of the writers mentioned above.
Marie-Claude Lorne
IHPST, Paris, France
Positional Information and Parity Thesis
The claim that genes play a privileged causal role in development and evolution has been severely criticized by DST proponents. One way to argue in favour of the idea that genes have a greater importance than other factors causally involved in development consists in maintaining that only genes are information carriers. This property would explain why they control and organize ontogenetic processes, the other factors being only subordinate resources. DST proponents have answered this view with the ‘parity thesis’: according to them, no analysis of the notion of information has the potential to isolate genes as the privileged cause of development, because any analysis applying to genes will equally apply to other, non-genetic causes of development.
I will try to evaluate the scope and validity of the parity thesis. I agree with the idea that if an informational
characterization of some intracellular processes is accepted, there is no reason to apply the notion of information to genes only. In this respect, the fact that developmental biologists admit the notion of positional information is interesting. I will attempt to evaluate the parity thesis by examining whether it is correct to claim that a concentration gradient of some protein carries information in the same sense as a DNA sequence carries information about a protein. If it appears that the two cases are not instantiations of the same notion of information, the parity thesis will be shown to be problematic.
Time: Thursday 11:30-13:00
Session: I.5 Room: Peter Chalk 1.3
Ludovica Lorusso1, Giovanni Boniolo1,2
1) University of Padova, Padova, Italy,
2) European School of Molecular Medicine, Milano, Italy
Clustering Humans: Boundaries & Properties
In human genetics, tracing boundaries and building clusters among humans seems to be an important goal. But what properties are involved in drawing these boundaries? In the talk I shall discuss the weakness of the epistemological and methodological debate about biological boundaries and biological clusters within medical and population genetics. I shall propose an explication of the concept of boundary in human genetics, on which boundaries, and therefore clusters among humans, are property-laden. I shall inquire into what these properties are and why some boundaries have no justification in a scientific context. The implicit message of this position will be that speaking in terms of races in the field of human genetics can be misleading. To analyze the epistemology and the methodology of tracing boundaries among humans is to clarify the specific contexts in which these boundaries make sense, and this is a necessary step towards avoiding the ineffective use of vague terms like “race” and “ethnic group”.
Session: VII.6 Room: Peter Chalk 1.5 Time: Friday 16:30-18:00
Room: Peter Chalk 1.3
Functional Homology and Homology of Function
“Functional homology” appears regularly in different areas of biological research. For example, in molecular developmental genetics the conserved roles of homeobox genes in axial patterning are referred to as functional homologues. And yet functional homology is a contradiction in terms, or so many structural biologists would claim: homology concerns identity of structure regardless of function. I argue that despite many problematic uses of functional homology, there is a legitimate concept of ‘homology of function’. This concept is articulated through attention to existing definitions and criteria utilized in structural assessments of homology (e.g. the comparative anatomy of skeletal elements). I also discuss the introduction of a novel criterion for judging function homologues, ‘organization’, which focuses on hierarchically interconnected interdependencies among functions. My account has multiple, interrelated philosophical consequences. (1) Homology of function requires “function” to be understood in terms of causal role rather than as selected effect. (2) Philosophers who see selected effect function as primary effectively ignore large portions of biological research. (3) As a consequence of (2), a significant gap exists in our understanding of the reasoning practices and explanatory strategies involved in domains of structural biology (e.g. functional morphology). (4) Closing this gap via homology of function is directly relevant to a more perspicuous analysis of functional characters such as behavior or psychological categories. (5) Homology of function also highlights how evolutionary research is executed without the invocation of natural selection while generating reciprocally supporting explanatory claims.
The Emergence Of Animal Law – On Institutional Conditions Of Research In Life Sciences In Germany, The USA And Japan
Animal law has become one of the significant issues for the study of the life sciences during the last decades. A growing concern for the well-being of other living beings has led to controversies over the legal principles defining the human-animal relationship, because modern law is traditionally based on the anthropological dichotomy between living human beings and other entities.
The current public interest in animal welfare encompasses serious political matters such as which kinds of legal restrictions should be applied to the activities of the life sciences. The use and treatment of experimental animals in biomedical research, in particular, has been one of the most intensively contested subjects. As a consequence, the practice of the life sciences is increasingly regulated by law. Researchers are faced with obstacles deriving from codified rules to protect animals. They are legally required to see that animals are treated not merely as objects but as living beings which can perceive pain and suffer psychological distress.
Session: VI.5 Time: Friday 14:30-16:00
Alan Love
University of Minnesota, Minneapolis, United States
Nico Luedtke, Hironori Matsuzaki
Technical University Berlin, Berlin, Germany
From a sociological standpoint, we analyze the historical formation of animal law as exemplified in the process of its implementation in Germany, the USA and Japan. One of the main focal points will be the question of by what means legislative and juridical agencies exert their long-term influence on science as a whole, and how they shape ways of doing research. For a comprehensive analysis of this correlation, it is necessary to reflect upon a wide range of institutional settings. Therefore, the specific ethical, economic and social conditions observed in each national context need to be taken into account, as well as the various kinds of legal culture. In our opinion, an intercultural approach enables critical insights into the social interaction between the practices of science and legislative politics. While recent statutory regulations set up certain frames to ensure animal care in scientific procedures, the definition of animal welfare often relies on rational knowledge gained from the related sciences.
Our core interest is to show how institutional conditions substantially affect scientific research, and to provide persuasive arguments about the contemporary dynamics that connect these different social fields. We propose that an anthropology of the life sciences should include the study of legal practices and discourses in order to understand scientific practices.
This research is part of the currently running DFG project “Consciousness and Anthropological Difference. A Comparative Study on Experimental Brain and Cognitive Research in Human Persons and Animals”.
Session: XII.10 Room: Peter Chalk 2.3 Time: Sunday 9:00-10:30
Pamela Lyon, Jon Opie
University of Adelaide, Adelaide SA, Australia
Session: XII.9 Room: Peter Chalk 2.5 Time: Sunday 9:00-10:30
Prolegomena For A Cognitive Biology
In general, there are two ways to approach cognition. One is to start with the features of the human case and try to generalize to other species. Another is to start with the biological conditions under which natural cognition evolved and currently operates and ask what organisms do such that they might require cognition. A full account of cognition requires both. Cognitive biology, however, requires a biogenic approach. Tight integration with biological knowledge places strong constraints on cognitive explanation. These constraints arise from the fact that cognition evolved in a particular context with very special features. All organisms are complex, self-organizing, dynamical systems that exist far from thermodynamic equilibrium and actively maintain themselves in this statistically improbable state by continuously manufacturing the components of the processes that sustain them. Organisms thus must interact with the world in ways that allow them to actively secure matter and energy. They are also persistence-valuing systems: most organisms have mechanisms for resisting or avoiding perturbations that threaten their integrity. A biogenic approach thus stresses the role of mechanisms that facilitate system persistence, for example, those that integrate information concerning external and internal states of affairs to facilitate adaptive behaviour; differentiate some states of affairs from others; invest different properties of the environment with different degrees of salience; appraise system needs relative to prevailing conditions, the potential for interaction, and whether the current interaction is succeeding (or not); and reduce the impact of random perturbations on system functioning, of which there are many, potentially lethal sources. These are recognisably cognitive functions. Mounting evidence suggests that even bacteria grapple with problems long familiar to cognitive scientists, including: integrating information from multiple sensory channels to marshal an effective response to fluctuating conditions; making decisions under conditions of uncertainty; communicating with conspecifics and others (honestly and deceptively); and coordinating collective behaviour to increase the chances of survival. Thus a biogenic approach not only justifies the use of very simple biological models to study cognition, it suggests that this is precisely where we ought to look to ascertain the general logic of the function, as well as the (potentially conserved) mechanisms that carry it out.
Aidan Lyon
Australian National University, Canberra, ACT, Australia
Probability in Evolutionary Theory
Evolutionary theory is up to its neck in probability. For example, probability can be found in our understanding of mutation events, drift, fitness, coalescence, and macroevolution.
Some authors have attempted to provide a unified realist interpretation of these probabilities. Others, when that has not worked, have concluded that no interpretation is available at all, defending a ‘no theory’ theory of probability in evolution. I will argue that when we look closely at the various probabilistic concepts in evolutionary theory, attempts to provide a unified interpretation of all these applications of probability appear to be poorly motivated. As a consequence, we need to be more careful in drawing conclusions from the fact that each interpretation fails for just one particular area of evolutionary theory. I will also argue that a plurality of interpretations is much better than no interpretation at all.
On a more positive note, I will outline a particular way in which a plurality of objective and subjective interpretations of probability can be strung together to provide a suitable understanding of the role probability plays in evolutionary theory.
Session: XII.3 Room: Newman D Time: Sunday 9:00-10:30
Sherrie Lyons
Empire State College, Saratoga Springs, New York, United States
East Meets West: Buddhism, Neuroplasticity and Mirror Neurons: Revisiting Evolutionary Ethics
In November 2005 the Dalai Lama gave the keynote address at the annual meeting of the Society for Neuroscience, which is indicative of the serious dialogue that is now occurring between practitioners of the contemplative traditions, particularly Tibetan Buddhism, and western scientists. In spite of interest in mind/body medicine in the 1960s and 70s, it never became mainstream. My previous research on the marginal sciences in the Victorian era documents that in the midst of discovery it is often difficult to distinguish what constitutes science from what does not, and that a variety of factors contribute to the following a particular scientific idea may generate. This paper explores some of the reasons for the intense interest at this moment in history. One of the key findings of current research is the idea of neuroplasticity: the adult brain is not fixed, but can be changed and trained. This has been well known for thousands of years by practitioners of Tibetan Buddhism. Mainstream neuroscientists are now collaborating with them and learning a great deal about neuroplasticity and the areas of the brain that are involved in different emotional states. Focusing on research that has been done with “expert” meditators, primarily Buddhist monks, I explore the implications of this research for evolutionary ethics. The work of the primatologist Frans de Waal, Antonio Damasio’s research on the role of empathy in rational decision making, and the discovery of mirror neurons, sometimes referred to as the Dalai Lama neurons, suggest the possibility of an evolutionary ethics that may finally break free of the many problems that have plagued it. This current research suggests that we are “hard wired” for empathy and kindness, and I discuss the role that meditation may play in activating the mirror neuron system.
Session: II.7 Room: Peter Chalk 1.6 Time: Thursday 14:30-16:00
James Maclaurin
University of Otago, Dunedin, New Zealand
Session: V.9 Room: Peter Chalk 2.5 Time: Friday 11:30-13:00
Universal Development
Modern developmental biology demonstrates an important connection between evolution and development based on the mechanics of gene regulation and the nature of evolvability. But much detail is still to be settled. There is debate about the nature of modules and about the effects that plasticity and entrenchment have on modular evolving systems. This paper explores our new understanding of development in light of Universal Darwinism: the project of extending the scope of the theory of natural selection to cover more than just lineages of reproducing organisms. If the theory of natural selection can plausibly be applied to non-biological entities such as cultural artifacts, can we also apply evolutionary developmental theory in such cases?
Eileen Magnello
University College London, London, United Kingdom
The Role of Evolutionary Biology in the Establishment of Mathematical Statistics
Mid-Victorian social and economic statisticians, whose academic disciplines lacked a single unifying theory, developed their statistical methods by borrowing mathematical tools from astronomers; thus, neither group developed a fully-fledged methodology or a statistical school to promulgate such methods. Whilst the mid-nineteenth-century vital statistician William Farr redefined statistics as a method of analysis, the statistical methods of these vital and social statisticians emphasised the measurement of averages, underpinned by the philosophical tenets of Aristotelian essentialism and determinism.
By the end of the nineteenth century, Charles Darwin’s ideas about species formation (i.e., speciation), natural selection and biological variation, which offered the potential to unify the biological sciences, not only challenged the vital statisticians’ ideas about the role of statistical variation (or deviations from the mean), but also led to an ideological shift in statistical thinking and to a different way of managing statistical data. This Darwinian realignment of the role of continuous variation prompted a major re-conceptualisation of statistics, which was initiated by Francis Galton, pursued by W.F.R. Weldon and transformed largely by the efforts of Karl Pearson when he established and
professionalised the discipline of mathematical statistics in the late nineteenth and early twentieth centuries.
Room: Peter Chalk 1.1
From classical holism to the biosemiotic turn, 1920-1940
Session: VII.4 Time: Friday 16:30-18:00
Riin Magnus
University of Tartu, Tartu, Estonia
My talk introduces the classic holistic theories that were formulated as an alternative to the vitalism-mechanism controversy of the 1920s to 1940s. After a brief biographical note on the biophilosophers discussed here (e.g. Jan Christiaan Smuts, John Scott Haldane, Adolf Meyer-Abich and Jakob von Uexküll), I will concentrate on the heterogeneity of holistic ideas and their relevance to today’s biophilosophy. The second part of the talk will elaborate on the diversity of these theories as exemplified by their holistic notions of organism and/or environment. Smuts, with a military and political background, coined the term holism in the 1920s, using the organism as a holistic model for describing all levels of reality. Extending Smuts’ views of holism, Haldane related the organism to its environment. Meyer-Abich, trained in idealistic morphology, disseminated a synthesis of the ideas of the British holists, also adding the methodological tool of top-down reasoning for comprehending reality. Unlike these holistic philosophers, and relying on his Umweltlehre, the zoologist and biosemiotician Jakob von Uexküll emphasized that holistic reality depends solely on human and animal perception. The objective of the talk is to show that there was no unitary holistic view in this period, but rather a diversity of holistic ideas that combined distinctive philosophical views ranging from Kantian idealism to dialectical materialism.
Session: III.9 Room: Peter Chalk 2.5 Time: Thursday 16:30-18:00
Session: IV.8 Room: Peter Chalk 2.6 Time: Friday 9:30-11:00
A General Theory of Inheritance and Its Implications
The concept of inheritance plays a very important role in the integration of evolutionary and developmental biology. Inheritance in living systems can be defined in very general terms as any process that causes the reoccurrence of traits in organisms through (at least in part) the similarity-making causal influence that organisms (and their environments) have on other organisms (and their environments). The similarity-making influence can be at the individual level or at the population level, and the reoccurrence can be within a particular generation or across many different generations. The processes responsible for the inheritance of traits do not necessarily also result in the inheritance of differences (between individuals, or groups, etc.). Genetic and nongenetic mechanisms are involved both in the inheritance of traits and in the inheritance of differences. In this paper, I develop a general theory of inheritance and use it to address various fundamental issues in the theory of natural selection, in the theory of cognitive and cultural evolution, and in niche-construction theory.
Christophe Malaterre1,2
1) Université Paris 1 Panthéon Sorbonne, Paris, France, 2) IHPST, Paris, France
Explaining the Origins of Life on Earth: Three Explanatory Schemes and a Set of Limit Conditions
The origins of life on Earth have been and still are a matter of great controversy. In the last fifty years, much scientific research has aggregated into numerous theoretical frameworks (Popa 2004). Owing to the complexity of the endeavour, these theoretical frameworks tend to focus on some aspects of the appearance of life and to formulate explanations in their own particular ways: some highlight, for instance, the centrality of RNA (Gilbert 1986), others the importance of amino acids (Miller 1953), still others the properties of liposomes (Monnard & Deamer 2002). This raises the question as to which explanatory schemes are being mobilized in origins-of-life research. The goal of this contribution is to clarify these explanatory schemes. I argue that three major explanatory schemes play a central role: (1) chemical processes, to justify the prebiotic synthesis of organic molecules; (2) a principle of prebiotic evolution, to explain the appearance of complex functional organic molecules; and (3) self-organization principles, to account for the formation of primitive forms of structure and organization. To illustrate my claim, I appeal to scientific work in prebiotic chemistry, molecular biology and theoretical biology. I also propose that these three explanatory schemes are used in a hierarchical and successive manner, and are constrained by a set of limit conditions. Furthermore, I discuss how this articulation of explanatory schemes might shed new light on the emergence of life on Earth.
Matteo Mameli
King’s College, Cambridge, United Kingdom
Session: VI.3 Room: Newman D Time: Friday 14:30-16:00
Session: VI.3 Room: Newman D Time: Friday 14:30-16:00
James Marcum
Mónica Maria Márquez Sánchez
Baylor University, Waco, TX, United States
UNAM, Mexico City, Mexico
Horizons for Scientific Practice: Scientific Discovery and Progress
The notion of horizon for scientific practice, representing particularly the experimental and theoretical limits or boundaries within which scientists ply their trade, is introduced to facilitate the analysis of scientific discovery, in contrast to the traditional notion of the logic or context of scientific discovery. Briefly, my contention is that a scientific discovery or an addition of the “what” of a discovery to the corpus of scientific knowledge is a consequence of scientific activity by a community of scientists working within a particular horizon of scientific practice, which includes both the experimental and the theoretical. The discoveries of the clotting factor thrombin, by Alexander Schmidt of Dorpat and Andrew Buchanan of Glasgow, and of the blood thinner heparin, by the Johns Hopkins physiologist William Howell and his student Jay McLean, are used to illustrate the notion of experimental and theoretical horizons. Also, the notion of progressive horizon—in which the practice of scientists intersects dynamically with “the way nature is” and in which discovery is present potentially in every experiment—is introduced to analyze scientific progress, in contrast to the static, traditional notion of progress. The proposed notion of progress in science involves the means by which scientists articulate theories through experimental evidence, provided by the development of experimental design and execution. The result is a closing of a gap, as it were, between the explication of natural phenomena through theories and the modeling of those phenomena through experimental activities, by scientists. In short, for the philosopher of science, the underdetermination of a scientific theory by experimental data and observations lessens as science progresses. The experiments conducted by Howell and his student Louis Rettger on the enzymatic nature of thrombin are used to illustrate this notion. In conclusion, the notion of horizon for scientific practice helps to address issues surrounding not only scientific discovery, in terms of “what” is discovered, but also scientific progress, in terms of how the “what” of discovery represents advancement in the scientific understanding of the world.
Idealization And Model Organisms
Currently there are significant discussions regarding model organisms. These analyses emphasize the idealized nature of such organisms in experimental practice. The aim of this paper is to understand this process of idealization. I review a variety of proposals regarding the characteristics of model organisms, including those of Kohler (1994), Harre (2003), Ankeny (2006) and Rheinberger (1997). I argue that an extremely useful way of characterizing the process of idealization is to see it as material simplification, in the context of experimental systems, of a series of sequential entities: (i) natural organisms, (ii) domesticated organisms, (iii) controlled domesticated organisms, and, finally, (iv) standardized and controlled domesticated organisms. In conclusion, I consider some implications of my account of material idealization for the views of Cartwright (1999) and Radder (2006) on idealization and abstraction.
Session: XI.3 Room: Newman D Time: Saturday 16:00-17:30
Eric Martin
University of California, San Diego, San Diego, California, United States
Primordial Soup and the Spice of Life: J.B.S. Haldane Between Holism and Mechanism
The question of life’s origins has become a topic of properly scientific investigation only in the 20th century. What has allowed the entrance of this seemingly fundamental problem into the realm of scientific discourse? In an attempt to take one step towards an answer, I here examine the thinking and context behind J.B.S. Haldane’s 1929 treatise, The Origin of Life, which today stands as a seminal contribution to modern theorizing on the topic. I will argue that Haldane’s adoption of mechanistic explanations was an important philosophical development which licensed a “modern” hypothesis about prebiotic evolution. Moreover, this was not a trivial change in Haldane’s thought, as he had previously subscribed to a philosophy of biology that may have been antithetical to such hypotheses. Yet Haldane’s position and work within F.G. Hopkins’ nascent biochemistry program at Cambridge eventually instilled an understanding of biological systems in terms of mechanisms, and it was with such an explanatory framework in mind that he posited his theory of origins.
Session: VII.9 Room: Peter Chalk 2.5 Time: Friday 16:30-18:00
experience concerning that subject, the answer is that he did not have unquestionable evidence to support his views and to present them to his readers in the way he did.
Time: Saturday 14:00-15:30
Roberto de Andrade Martins, Juliana Ferreira
State University of Campinas, Sao Paulo, Brazil
Alfred Russel Wallace’s Claims Regarding Spiritualism
Alfred Russel Wallace claimed that Natural Selection could not explain some human physical features, and even less the human intellectual and moral development. A higher intelligence had foreseen the future and prepared man for it. Wallace collected empirical evidence both to show that Natural Selection does not explain some human features and to support the “Theory of Spiritualism”, which in his opinion explained spiritual phenomena. For him, these phenomena had the same status as any other phenomenon observed by science. Wallace’s conceptions about man and his involvement with spiritualism have been the subject of many previous studies. We intend to contribute to this debate by analyzing the way he handled empirical evidence to support the existence of spiritual phenomena. In his book On Miracles and Modern Spiritualism, Wallace presented some criteria that, if followed, would warrant a trustworthy investigation of those phenomena. He also pointed to some investigations (such as the experiments of the American chemist Robert Hare and those of the British chemist William Crookes) as irrefutable evidence. Notwithstanding Wallace’s claims, a careful analysis of Hare’s experiments leads to the conclusion that his investigations were far from reliable and did not follow such rigid scientific criteria as the naturalist maintained. In the case of Crookes, although his publications seemed to exhibit a careful scientific study of spiritualistic phenomena, his personal notes show that there were severe limitations in the control of his observations and experiments. Wallace took part in some of the séances carried out at Crookes’s home and could have noticed those flaws. Moreover, Wallace had already been present at some meetings under conditions that could suggest the need for better control of the situation. A detailed analysis shows that the investigations he cited as trustworthy did not always follow the criteria that he himself required for a good investigation of those phenomena. Besides, in some respects they seemed to be less reliable than the research pursued by the London Dialectical Society committee, in which Wallace had taken part. Whether he had some private reasons to accept the reality of spiritual phenomena is not our concern. Our question is whether he had good empirical grounds for claiming the reality of those phenomena, as he did in his book. According to his own criteria and his own personal
Session: X.7
Room: Peter Chalk 1.6
Lilian Pereira Martins1,2
1) Pontifícia Universidade Católica de São Paulo, São Paulo, SP, Brazil, 2) Conselho Nacional de Desenvolvimento Científico e Tecnológico, Brasília, DF, Brazil
Opponents can help: Sturtevant, Morgan and the building of the first chromosome maps
Around 1906, William Bateson, Reginald C. Punnett and Edith Saunders noticed that in some crossings of peas certain characters were always inherited together, contrary to the Mendelian principle of independent segregation, and called the phenomenon “coupling”. Afterwards they noticed that coupling could be partial. Later, in 1911, they tried to explain this fact through the “reduplication hypothesis”, which involved processes related to cell division but had no relationship with chromosomes. Through such a hypothesis it was possible to make quantitative predictions. Morgan and his group, who devoted themselves to the development of the Mendelian chromosome theory from 1910–1911 onwards, observed the occurrence of coupling in Drosophila. They interpreted the phenomenon as due to the occurrence of coupled characters in the same chromosome and called it “linkage”. However, at first this was just a qualitative hypothesis. Sturtevant initially applied both the terminology and the method of the reduplication hypothesis to calculate the percentage of recombinants in Drosophila. Only later (from 1913 onwards) did he arrive at the idea of chromosome maps. The idea did not spring fully formed from Sturtevant’s mind, but underwent successive changes, taking into account suggestions and criticism from opponents. The aim of this communication is to discuss the development of the chromosome maps by Sturtevant and Morgan’s group in the period 1911-1919, considering their achievements and the difficulties they faced, and taking into account the contributions of people such as William Castle, A. H. Trow and Bateson, who had a different view of the subject.
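As an illustration of the mapping logic at issue here, the short sketch below turns pairwise recombination percentages into a linear map by reading recombination frequency as distance and using additivity to infer the gene order; the loci and frequencies are hypothetical, not Sturtevant's actual Drosophila data.

```python
# A minimal sketch of the mapping logic introduced in 1913: treat the percentage
# of recombinant offspring between two linked loci as a measure of their distance
# on the chromosome, and use additivity to infer a linear order.
# The loci and the frequencies below are hypothetical, for illustration only.

recombination = {
    ("w", "v"): 30.0,   # recombination frequency (%) between locus w and locus v
    ("w", "m"): 34.0,
    ("v", "m"): 4.0,
}

def freq(a, b):
    """Look up a pairwise recombination frequency regardless of locus order."""
    return recombination.get((a, b), recombination.get((b, a)))

loci = ["w", "v", "m"]

# The pair with the largest recombination frequency sits at the two ends;
# the remaining locus lies between them.
ends = max(recombination, key=recombination.get)
middle = next(l for l in loci if l not in ends)
order = [ends[0], middle, ends[1]]
print("Inferred order:", " - ".join(order))

# Map positions follow by adding distances along the inferred order.
positions = {order[0]: 0.0}
positions[order[1]] = freq(order[0], order[1])
positions[order[2]] = positions[order[1]] + freq(order[1], order[2])
print("Map positions (map units):", positions)
```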
Session: V.7 Room: Peter Chalk 1.6 Time: Friday 11:30-13:00
causal claim, if not as a way of dodging the label of ‘genetic determinism’.)
5) Therefore, I am suspicious as to whether there really is a middle ground that can eschew the commitment to metaphysical and monistic realism Waters sees in Sober and at the same time claim to latch onto a kind of realism (‘tempered realism’) without falling into the extreme conventionalism that he also wants to eschew.
Shunkichi Matsumoto
Tokai University, Kanagawa, Japan
Evaluating The Debate On Genic Selectionism: Based On The Heterozygote Superiority Case
In this talk, I will present my analysis of the long-standing debate on the legitimacy of ‘genic selectionism’ (originally espoused by G. C. Williams and Richard Dawkins in the larger framework of the units of selection problem), based mainly on the heterozygote superiority case put forward by Elliott Sober (and Richard Lewontin) to counter genic selectionists, and on the subsequent exchanges, continuing until recently, joined by such prominent figures as Kim Sterelny & Philip Kitcher, Elisabeth Lloyd, and Ken Waters. I cannot pronounce a final verdict on this debate, but the following are some of the points of argument that represent my basic stance toward the issue.
1) Sober’s criterion for units of selection by context sensitivity should eventually be abandoned. That is, we cannot assume that an entity cannot qualify as a unit of selection just because its fitness value shows context-dependency.
2) Sober’s principle of causal uniformity rests on the conception of what he calls a group-level (property) causality distinguished from an individual-level (token) causality and hence population thinking is already embedded in his way of formulating the issue. Therefore, criticism in Sterelny & Kitcher (1988) to the effect that Sober’s principle runs counter to statistical consideration is off target.
3) I agree with Waters that the ‘bookkeeping’ role of the genic representation is unquestionable and hence that the real locus of the problem resides not in the representational correctness of genic models per se, but in whether those models can truly capture the causal interactions going on in nature.
Session: VII.5 Room: Peter Chalk 1.3 Time: Friday 16:30-18:00
4) In that respect, Waters’ genic-selectionist presentation of the heterosis case (even granted that it is an ad hominem tactic to counter monists) still leaves it unclear, as Lloyd claims, how it can explain the non-derivative causal efficacy of the individual allele responsible for the evolution of heterosis, beyond merely redefining or renaming the causal efficacy at the level of diploid genotypes as that of single alleles. To be sure, we can grant the status of causal origin to an allelic change if we allow transitivity in the chain of causal influences, but that tactic may make the causal claim concerned somewhat vacuous. (In that sense, the ‘genes as difference makers’ tactic is not very meaningful as a
John Matthewson
Mohan Matthen
University of Toronto, Toronto, Canada
Cognitive Kinds and Homology
Functionalists claim that cognitive kinds such as perception and rationality should be defined in terms of selected effects. But they are rarely forthcoming about the specific character of these selected effects. For example: in terms of what effects of the colour vision system should this faculty be defined? Some functionalists are silent about this because they think that the answer is obvious: colour vision is selected for its ability to make information about colour available to the organism. Unfortunately, however, this assumption is wrong; in fact, most colour properties are available to organisms that lack colour vision proper (be that understood as dichromacy or trichromacy). Repairing this defect in functionalist definitions is not straightforward: it requires us to abandon the view that colour vision is a faculty that can be understood in isolation, and to study the hierarchical relationships between parts of the visual system, and to investigate how colour-detecting cells contribute to visual functions generally. In this paper, I shall show how such a hierarchical investigation of colour vision (and other cognitive kinds) invokes notions of homology, both historical and of other sorts.
Session: X.5 Room: Peter Chalk 1.3 Time: Saturday 14:00-15:30
Australian National University, Canberra, ACT, Australia
Modeling trade-offs and scientific explanation.
It is often argued that the generality of a model has an impact on its explanatory power. However, even if we grant this, it is by no means clear that generality is the sole or even most important explanatory desideratum in science. For example, the causal history of a phenomenon is thought by many to be an essential part of its explanation. The idea that we have multiple
desiderata for what makes a model highly explanatory is unproblematic if scientists are able to produce models that maximise both (or all) of these desiderata. However, there is a real question regarding whether we can have such “brute force” explanatory models. In Richard Levins’ “The Strategy of Model Building in Population Biology” and subsequent work by other authors, it has been argued that the generality of a model exhibits a trade-off against other modeling properties. If any of the properties that generality trades off against are also explanatory desiderata, then there may be limitations on how explanatory any single model can be. I will examine this hypothesis and discuss what repercussions the results may have on modeling practice in general and in population biology in particular.
her explication of a “standard of rationality” provides a more nuanced picture of this disagreement. The framing of this debate as a rational disagreement shows that neither community has presented (nor can present?) sufficient evidence or argumentation to win the controversy over which algorithm is better for reconstructing the tree of life.
Fabrizzio McManus
National Autonomous University of Mexico, Mexico City, Mexico
Rational Disagreement In Phylogenetics: Maximum Parsimony or Maximum Likelihood?
I analyze recent phylogenetic discussions in light of Thomas Kuhn’s and Helen Longino’s respective accounts of “rational disagreement” and “contextual empiricism.” My philosophical motivations are to investigate the role of controversy in science and to clarify the relationship between classification and the evolutionary process. An ongoing debate in phylogenetic systematics exists between two methodologies for reconstructing phylogenies: maximum parsimony (cladism) vs. maximum likelihood. Proponents of the former work within a Popperian corroboratory framework which seeks to minimize the number of ad hoc hypotheses (sensu “character state change(s)”), whereas defenders of the latter employ statistical measures for assessing the likelihood of different hypotheses (sensu “phylogenetic trees”). I argue that the tensions between these opposing algorithms are a case of what Kuhn calls “rational disagreements.” According to Kuhn, two communities, or two members of different communities, rationally disagree when they: (1) defend different and locally incompatible (or, at best, mutually irrelevant) methodologies, virtues, goals or theories, (2) agree on most of the methodologies, virtues, goals or theories, and (3) do not violate the standards of rationality endemic to their respective community. Cladists and maximum likelihoodists rationally disagree in that they utilize distinct methodologies and theoretical virtues for attaining the shared goal of constructing a systematic classification of the tree of life. Furthermore, the use of Longino’s account of community individuation as well as
Session: IX.9 Room: Peter Chalk 2.5 Time: Saturday 11:00-12:30
Session: IX.1 Room: Newman B Time: Saturday 11:00-12:30
Lauren McCall
National Evolutionary Synthesis Center, Durham, United States
Session: VII.8 Room: Peter Chalk 2.6 Time: Friday 16:30-18:00
Isolation vs. Diffusion: A cross-cultural test
Cross-cultural research exploits the fact that, like biological diversification, cultural diversification provides natural experiments in evolution. In addition to the Mendelian mixing of genetic and some conserved behavioral traits among migrating populations, the consideration of extragenetic cultural inheritance must take into account the greater flexibility of cultures to share behavioral traits that do not obey Mendelian laws of segregation. For such traits, regions of greater cultural contact should be relatively homogeneous compared to isolated cultures. This study tests the impact of geographic isolation on the heterogeneity of symbolic classification schemes.
Constantinos Mekios
Stonehill College, Easton, MA, United States
Implications Of Current Applications Of Systems Biology For The Scientific Autonomy Of Biology
Within the past decade, systems biology (SB) has emerged as an ambitious and putatively new approach to the study of the organization and regulation of complex biological systems. Although SB remains largely undefined with respect to its scope, its specific objectives, and the strategy for achieving them, certain of its basic features are clearly discernible when examining its current applications. An immediately identifiable feature of the approach is that it involves multiple disciplines: not only biology and its branches, but also others which are traditionally regarded as non-biological. In particular, SB enlists contributions from areas of inquiry such as computer science, engineering, information science, and mathematics/statistics, in order to address questions about organisms that have remained beyond the reach of classical biology. Additionally, SB seeks the formalization and
quantification of the properties of biological systems and uses network representations in order to visualize them. The adoption of this strategy implicitly suggests that the attainment of a more comprehensive (integrated or global) view would facilitate the understanding of the systems studied, thus allowing for satisfactory explanations of their complex structures and functions.
Despite the fact that a lively debate is currently taking place about the scientific attributes of SB and about whether its recent popularity is justified, so far little attention has been paid to the examination and evaluation of the methodology of SB with respect to its capacity for producing new insights into biological complexity. Even less consideration has been given to the implications of the case of SB for the philosophical debate regarding the character and theoretical foundations of biology. In this paper I argue that a consideration of the current implementation of the systems approach reinforces arguments in favour of the scientific autonomy of biology. Concepts from non-biological disciplines have not yet been properly integrated or sufficiently adjusted so that they could adequately serve the task of studying complex biological systems. Accordingly, biological considerations and concepts remain essential for the accomplishment of this task because of their explanatory function. Ultimately, these observations suggest that SB must retain the aforementioned informal concepts of biology – historical, hierarchical, teleological, or, more generally, causal – in order to fully benefit from an incorporation of the formal concepts of other sciences and from their application to the study of biological systems.
Such a notion of inheritance stresses the role of past generations in the construction of the offspring’s developmental context, and focuses on the evolutionary relevance of population-environment relations. Any developmental genetic or extra-genetic source contributing, at each generation, to the reconstruction of a life cycle and forming part of the explanation of the constancy of developmental patterns across generations counts as an inherited factor. However, DST proponents do not defend a multiple inheritance system. On the contrary, they reject the distinction between several relatively independent channels of inheritance in causal interaction. Consistently, they reject dichotomies in the conceptualization of development (e.g. the distinction between nature and nurture), and representations of developmental and evolutionary causation as the mere addition of genetic and extra-genetic causal factors.
I will try to show that it is problematic to consider any resource necessary for development as inherited, and that the “causal parity” thesis, when applied to evolution, is unwarranted. More precisely, I will argue that DST does not provide any evidence for the claim that the evolutionary causal power is neither localised in any privileged factor, nor in many different channels of inheritance, but is diffused in the entire developmental system (i.e. the entire set of developmental interactants). Furthermore, I intend to analyze some closely interrelated problematic aspects of DST’s expanded concept of inheritance, such as the rejection of the notion of reliable transmission and its replacement by the notion of stability. The problem is that DST proponents neglect some important differences concerning the multiple mechanisms which enable stability of heritable developmental resources across generations. In particular, no distinction is made between developmental resources that persist in virtue of the activities of past generations, and resources the stable presence of which is independent of and unaffected by past generations (e.g. gravity, sunlight). Finally, I suggest that DST’s interactionist approach does not provide any advantage from an experimental point of view.
Time: Thursday 14:30-16:00
Evolution of individuality during the transition from unicellular to multicellular life
How and why do groups become individuals? These are the central questions motivating our work. We consider the problem of the origin of multicellularity and the transition from groups of undifferentiated cells to groups of differentiated cells specialized at reproductive
Session: IX.3 Room: Newman D Time: Saturday 11:00-12:30
Francesca Merlin
IHPST, Paris, France
DST’s Concept of Expanded Inheritance: Is It Too Expanded?
Developmental Systems Theory (DST) opposes the orthodox vision of evolution as change in gene frequency and, on this basis, reformulates the concept of inheritance from a developmental systems perspective. Starting from the idea that there is no primacy of the genes in development (the “parity thesis”), DST suggests an expanded concept of inheritance, according to which many extra-genetic factors are inherited and evolutionarily relevant. Such an extended definition applies to “any resource that is reliably present in successive generations, and is part of the explanation of why each generation resembles the last”. According to DST, these developmental resources (or interactants) all together form the developmental system.
Session: II.2
Room: Newman C
Rick Michod
University of Arizona, Tucson, United States
and vegetative (viability-enhancing) functions. Our theory predicts that the trade-off between fitness components (viability and reproduction) is a major factor driving this transition. In particular, we predict that convex curvature of the trade-off selects for specialization and that the curvature shifts from concave to convex as cell-group size increases. We have tested our models in two ways, taking both a how and a why approach. First, we have studied the origin of the genetic basis for reproductive altruism in the multicellular Volvox carteri by showing how an altruistic gene may have originated through co-option of a life-history trade-off gene present in a unicellular ancestor. Second, we ask why reproductive altruism and individuality arise only in the larger members of the volvocine group (recognizing that high levels of kinship are present in all volvocine algae groups). Our answer is that the selective pressures leading to reproductive altruism stem from the increasing cost of reproduction with increasing group size, which creates a convex curvature of the trade-off function.
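The role of curvature can be illustrated with a toy calculation (an illustrative sketch, not the authors' model: two-cell groups, a trade-off of the assumed form b(v) = (1 - v)^alpha, and group fitness taken as the product of average viability and average fecundity):

```python
# Toy illustration of how the curvature of a viability-fecundity trade-off
# affects selection for specialization. Assumptions (not the authors' model):
# each cell chooses a viability effort v in [0, 1]; its fecundity is
# b(v) = (1 - v) ** alpha; group viability and fecundity are the cell averages,
# and group fitness is their product.

def group_fitness(vs, alpha):
    bs = [(1.0 - v) ** alpha for v in vs]
    V = sum(vs) / len(vs)          # group viability = average cell viability
    B = sum(bs) / len(bs)          # group fecundity = average cell fecundity
    return V * B

for alpha, label in [(2.0, "convex trade-off"), (0.5, "concave trade-off")]:
    generalists = group_fitness([0.5, 0.5], alpha)   # both cells do both tasks
    specialists = group_fitness([1.0, 0.0], alpha)   # one 'soma' cell, one 'germ' cell
    winner = "specialists" if specialists > generalists else "generalists"
    print(f"{label}: generalists={generalists:.3f}, "
          f"specialists={specialists:.3f} -> {winner} favoured")
```

With a convex trade-off (alpha > 1) the germ-soma split outperforms a group of generalists; with a concave trade-off (alpha < 1) the ranking reverses.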
and the nearly neutral theory) or are they simply instances of different disciplines using the same evidence to support their respective accounts? Could these claims be seeds for future connections? My presentation will explore the answers to these questions.
Session: I.10 Room: Peter Chalk 2.3 Time: Thursday 11:30-13:00
Is there Nomological Closure in Explanations in Biology?
In this paper we will defend a specific concept of emergence. Biology deals with the problem of novelty, since nothing in biology makes sense except in the light of evolution. Emergence is not supervenience. Thinking of emergence as global or mereological supervenience requires accepting the dogma of physicalism, which is founded on two assertions: the completeness and the causal closure of the physical world.
To the contrary, we will argue that the physical world is incomplete and that, if we engage again with downward causation in biology, many problems in the fields of the origin of life, aging and carcinogenesis can be illuminated. And we will show that, since scientific proof in biology is founded not on laws but on models, simulations and experiments, we also reject the idea of a nomological closure.
Session: VI.1 Room: Newman B Time: Friday 14:30-16:00
Paul-Antoine Miquel
Université de Nice, Nice, France
Roberta L. Millstein
UC Davis, Davis, CA, United States
The Nearly Neutral Theory of Evo-Devo?
Recent work on heat shock protein 90 (HSP 90) by Rutherford and Lindquist (1998) has been included among the pieces of evidence taken to show the essential role of developmental processes in evolution; HSP 90 acts as a buffer against phenotypic variation, allowing genotypic variation to build. When the buffering capacity of HSP 90 is altered (e.g., in nature, by mutation or environmental stress), the genetic variation is “revealed,” manifesting itself as phenotypic variation. This phenomenon raises questions about the genetic variation before and after what I will call a “revelation event”: Is it neutral, nearly neutral, or non-neutral (i.e., strongly deleterious or strongly advantageous)? Moreover, what kinds of evolutionary processes do we take to be at work? Rutherford and Lindquist (1998) focus on the implications of non-neutral variation and selection. Later work by Queitsch, Sangster, and Lindquist (2002) and Sangster, Lindquist, and Queitsch (2004) raises the possibility that HSP 90 buffering may play the role that was played by drift in Sewall Wright’s shifting balance model, permitting transition from one adaptive peak to another. However, Ohta (2002) suggests that much of this variation may be nearly neutral, which, in turn, would imply a strong role for drift as well as selection. Do these claims represent true intersections among disciplines (evo-devo, traditional population genetics,
Session: III.8 Room: Peter Chalk 2.6 Time: Thursday 16:30-18:00
Barton Moffatt
University of Minnesota, Twin Cities, Minneapolis, Minnesota, United States
Signaling Processes and Biological Function: An Account of Signal in Cellular Biology
What is a signal in cell biology? This paper sets out an account of signal that captures the meaning of cell biologists when they characterize a biological phenomenon as a signal. The ubiquity of signaling processes in cell biology indicates that they are central to explanations in this field. But the lack of a satisfactory philosophical analysis of signal suggests that there is value in developing an analysis of it in cell biology. Jablonka (2002) offers a brief general account of signal in biology, roughly in terms of evolved meaningfulness, as part of her general account of biological information, but I argue that it fails to capture the standard cell biology usage.
The aim of this paper is to provide an account of signal in cell biology in terms of a basic signaling concept and an elucidation of the functional/explanatory context in which it occurs. I argue that cell biologists use the concept of signal to characterize distinct functional roles in biological systems. The important question is not ‘What is a signal?’ but ‘Why do biologists categorize some functions as signaling?’ In addition to characterizing a particular functional role, my account will also analyze the meaning of the concept of signal and identify the relevant functional context in which talk of signaling takes place. The basic concept is that a signal carries information between different parts of a given biological system, where information is understood as a causal indication of the state of a system at a given point. This idea is used by cell biologists in the context of a style of functional explanation generally known as ‘causal role function’, in which a mechanism or entity has a function if its behavior explains a contribution to a capacity of interest. I end by showing that my account successfully differentiates between cases of signaling and non-signaling in cell biology by examining an interesting case of signaling by quorum-sensing bacterial colonies.
REFERENCES
Jablonka, Eva (2002), “Information: Its Interpretation, Its Inheritance and Its Sharing” Philosophy of Science 69: 578-605.
Time: Saturday 9:00-10:30
The increasing place of macromolecular machines in the descriptions of molecular biologists: What role do they play in explanations?
The word ‘machines’ is increasingly used by molecular biologists to designate macromolecular complexes having specific functions within cells. Some discoveries, for instance that the main energy-producing system behaves as a rotor in the mitochondrial membrane, obviously had a triggering effect. But the description of macromolecular complexes as machines has deeper roots in the history of molecular and cell biology. What does this description afford our understanding of cellular functions? What is the relation between this concept of machine and the presently very fashionable description in terms
Machines belong to the physical world, whereas macromolecular machines belong to the chemical world. Is the use of the expression ‘machine’ purely metaphorical, or does it correspond to a kind of
physicalization of the chemical world of macromolecules? What are the meaning and limits of the replacement of an enzymatic description by a mechanical one? Machines represent only one category of possible mechanisms. Does the use of the term machine preclude a broader vision of the explanatory role of mechanisms in present-day biology? Or, on the contrary, is it a first step towards a full recognition of the importance of mechanisms?
Session: II.6 Room: Peter Chalk 1.5 Time: Thursday 14:30-16:00
Samantha Muka
Florida State University, Tallahassee, FL, United States
Session: VIII.2 Room: Newman C
Syphilis, The Church, and The Body: Disease, Cause, and Treatment in 17th-Century England
Religion and the medical establishment have not been cleanly separated in history. Together with the sickness being experienced, there were also social implications attached to the disease that had been acquired. Before the discovery and acceptance of the germ theory, disease and recovery were linked to the church and to the moral value of the patient. A sexually transmitted disease, which we today call syphilis, was supposedly introduced into the European population at the end of the 15th century. Although it was originally called the French Pocks or the Spanish Sickness, it was renamed syphilis in 1546 after the publication of a poem written by Girolamo Fracastoro. Fracastoro suggested that disease was caused by an agent outside of the body that could be spread in many different ways, most notably through the air. If these views of disease caught on, what social and moral consequences for the reception of the diseased were produced? This paper seeks to highlight the medical writings pertaining to syphilis in England after Fracastoro’s publication and to ask how widely and quickly they were disseminated, by examining the description of the disease, its origins, and its treatments over the course of the late 16th and 17th centuries. My suggestion is that the idea of Fracastoro’s disease ‘seeds’ was more quickly embraced than previously imagined, and that the idea was accepted for social reasons, most notably the number of ‘good’ women and children who were contracting syphilis at the time.
Michel Morange
Ecole normale supérieure, Paris, France
of modules?
Session: VIII.9 Room: Peter Chalk 2.5 Time: Saturday 9:00-10:30
argues that the dynamics can also serve as a representation of another ecological model, showing how monocultures can arise even in competitive environments. Finally, we find that the model can explain some features of the organization of the scientific community itself.
Ryan Muldoon, Michael Weisberg
University of Pennsylvania, Philadelphia, PA, United States
Correlating Strategies With Neighbors Even When The Goal is Anti-Correlation
Session: I.2 Room: Newman C Time: Thursday 11:30-13:00
Recent discussions of group selection and the evolution of altruism have emphasized the importance of correlation between individuals. Such correlation is generally thought to involve an individual-based preference for correlation with or mating with other organisms sharing some trait. But can correlation occur in other circumstances as well? In this paper, I argue that correlation can occur even when all individuals are trying to anti-correlate with one another. I will show this with the help of a novel individual-based ecological model, developed jointly with Michael Weisberg.
Consider the following individual-based foraging model. Each organism is represented as a discrete decision maker placed spatially on a torus. In each round of the model, every organism decides on one of n possible foraging strategies. Since each strategy’s energetic payoff is logistically determined, each strategy has diminishing marginal energetic payoff. All agents are fully informed about the shape of the payoffs, including the carrying capacity of each foraging strategy. Because of this diminishing marginal return on foraging strategies, each agent seeks to anti-correlate with the other agents in an effort to maximize its marginal return. What is striking about the model, however, is that under certain conditions, this incentive to anti-correlate results in correlation.
If we add to the base model a maximum foraging distance for all agents, then we can define a neighborhood for each agent – an agent’s neighbors are those that are within its foraging radius. If agents have a maximum foraging distance which covers the entire torus, then foraging strategies are randomly distributed among agents, within the constraints of their carrying capacities. If, however, we instead constrain the maximum foraging distance to middle values – 25-75% of the torus – the agents will correlate their foraging strategies with their immediate neighbors. This result is similar to Schelling’s Segregation Dynamic, except that it is arrived at with more counter-intuitive assumptions. While Schelling’s agents had minor preferences for correlation, our agents have preferences for anti-correlation, and find themselves spatially segregated anyway. This dynamic holds for both stationary and mobile agents.
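A stripped-down sketch in the spirit of this model is given below; the grid size, foraging radius and least-crowded-strategy update rule are assumptions made for illustration, not the authors' implementation. Re-running it with different RADIUS values allows the neighbourhood-size effect described above to be explored.

```python
# A stripped-down sketch in the spirit of the foraging model described above:
# agents on a torus repeatedly adopt whichever foraging strategy is least used
# within their foraging radius (the anti-correlation incentive created by
# diminishing returns). Grid size, radius and the best-response update rule are
# illustrative assumptions, not the authors' implementation.
import random

SIZE, N_STRATEGIES, RADIUS, ROUNDS = 20, 3, 5, 30
random.seed(0)

# strategy[x][y] is the current foraging strategy of the agent at cell (x, y)
strategy = [[random.randrange(N_STRATEGIES) for _ in range(SIZE)] for _ in range(SIZE)]

def neighbours(x, y, r):
    """All cells within Chebyshev distance r on the torus, excluding (x, y)."""
    return [((x + dx) % SIZE, (y + dy) % SIZE)
            for dx in range(-r, r + 1) for dy in range(-r, r + 1) if dx or dy]

for _ in range(ROUNDS):
    for x in range(SIZE):
        for y in range(SIZE):
            counts = [0] * N_STRATEGIES
            for nx, ny in neighbours(x, y, RADIUS):
                counts[strategy[nx][ny]] += 1
            # Diminishing marginal returns: the least crowded strategy pays best locally.
            strategy[x][y] = counts.index(min(counts))

# Crude diagnostic: how often do immediate neighbours end up sharing a strategy?
same = total = 0
for x in range(SIZE):
    for y in range(SIZE):
        for nx, ny in neighbours(x, y, 1):
            total += 1
            same += strategy[x][y] == strategy[nx][ny]
print(f"share of adjacent pairs using the same strategy: {same / total:.2f}")
```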
The model also has a wider explanatory capability. Beyond the application to foraging behavior, the paper
Gerd Müller
University of Vienna, Vienna, Austria
Where EvoDevo goes beyond the Modern Synthesis
The emergence of Evolutionary Developmental Biology in the early 1980s was both a response to the essential incompleteness of the synthetic theory, in particular regarding phenotypic evolution, and a result of methodological advances in developmental biology, foremost in developmental genetics. What came to be called “EvoDevo” has evolved into a highly productive discipline and has diversified into several branches of empirical research. Today at least four major programs of investigation can be distinguished, each characterized by specific questions, methods, and goals. These empirical programs have generated a flood of new data and have revolutionized our understanding of how development evolves. It is much less clear in what sense EvoDevo has informed evolutionary theory. Although EvoDevo has been greeted by many as a new paradigm that could lead to a second evolutionary synthesis, it remains to be demonstrated whether such expectations can be fulfilled. The formal integration with the population genetic framework
could prove difficult, but plasticity theory and life history theory harbor promising possibilities. However, the causal-mechanistic approach of EvoDevo also represents a major departure from the population-genetic correspondence paradigm that lies at the core of the synthesis. EvoDevo concentrates on phenotypic evolution and permits predictiveness not only about what is going to be maintained and adaptively varied but also about what can possibly arise under given conditions. This extends the explanatory reach of evolutionary theory beyond the capacities of the framework of the synthesis. Three domains shall be characterized in which EvoDevo provides theoretical innovation: evolvability, emergence, and inherency. Evolvability, based on new results from plasticity, modularity, and integration research, extends our understanding of the lineage-specific potentials to produce phenotypic variation. Emergence, addressed through the issues of origination, innovation, and novelty, provides access to the problem of the origin of new phenotypic characters and body plans. And inherency, informed through the work on the generic
physical and epigenetic properties of developmental systems, refers to the fact that the inclusion of EvoDevo into evolutionary theory represents a shift of explanatory weight from the external and contingent to the internal and generic. Whereas historical contingency is a key element in the standard neo-Darwinian framework, accounting for the lawful dependence on conditions that involve a large component of chance, inherency is something that will always happen because the potentiality is immanent to the system and can actually only be inhibited. Inherency is the buzzword for the important new focus introduced by EvoDevo in locating the causality of the evolution of morphological form not in external selection and population genetic events, but in the dynamics of interaction between genes, cells, and tissues, each endowed with its own physical and functional properties and dependent upon interactions with the environment. This marks a significant deviation from the argument of the synthesis.
Session: III.5 Room: Peter Chalk 1.3 Time: Thursday 16:30-18:00
Session: VIII.4 Room: Peter Chalk 1.1 Time: Saturday 9:00-10:30
Mapping Global Mobilities: Family Connections and Difference in the Genographic Project
In April 2005 the National Geographic Society launched the Genographic Project ‘a five year effort to understand the human journey where we came from and how we got to where we live today’. It was announced that National Geographic’s ‘Explorer-in-Residence’, geneticist Spencer Wells, would co-ordinate this international project to ‘map humanity’s genetic journey through the ages’ through sampling the genetic material of groups identified as indigenous and genetically isolated. The Genographic Project (GP) has three components: The collection and analysis of blood samples from indigenous populations; the Public Participation and Awareness Campaign that invites ‘the general public’ to participate in the project by paying to have their own genetic material analyzed; and the Genographic Legacy Project which supports ‘education and cultural preservation projects among participating indigenous groups’. Drawing on National Geographic’s sophisticated multi-media entertainment and publicity technologies, and incorporating new commercial tests for ‘deep ancestry’ into its public relations strategy, the GP represents one of the most influential projects within the field of human population genetics. This paper situates the GP within current debates about the scientific validity and political effects of correlating patterns of genetic variation and cultural, ethnic or racial groups, and explores its efforts to avoid the criticisms of the ethics and underlying assumptions of HGDP through an anti-racist liberal celebration of ‘diversity’. I examine the ways in which human physiological difference is produced as a subject of interest that presupposes a natural and politically innocent curiosity about difference that is detached from the history of race and racism. I trace the ways the GP figures prehistoric and contemporary human mobility and immobility to construct a global geography based on distinctions between the mixed and the pure, mobile and rooted; how it reproduces images of primitive exoticism and uses the tropes of heroic scientific exploration in spite of calls by indigenous groups for the GP’s suspension. Finally, I explore the ways in which the figure of the global human family is deployed in the Project’s attempt to depict its work as a contribution to global harmony and understanding. In particular, the celebration of genetic interconnectedness serves to obscure the GP’s focus on genetic distinctions and to naturalise the implicit prioritising of genetic
Staffan Müller-Wille
ESRC Research Centre for Genomics in Society, University of Exeter, Exeter, United Kingdom
Sub-specific variation in the nineteenth century
The late eighteenth century and the first half of the nineteenth saw the emergence of heredity as one of the central problems of the life sciences. The problem that heredity came to address, however, was not the constancy of species, but the fluctuating patterns and processes that structure life at the sub-specific level. Phenomena of ‘degeneration’ and ‘spontaneous mutation’ on the one hand, as well as the related phenomena of ‘atavism’ and ‘regression’ on the other, caught the attention of naturalists and biologists, and called for explanations in terms of ‘laws’ of heredity and variation. I will discuss three prominent attempts to account for hereditary variation, those of Charles Darwin, Gregor Mendel, and Wilhelm Johannsen (pre-1900), and argue that the solutions each of these naturalists came up with were intimately connected with a biologically (and politically) fundamental issue: the autonomy of the ultimate, constituent parts of living bodies.
Catherine Nash
Queen Mary, University of London, London, United Kingdom
connection as the fundamental basis of social relations within the project and within commercial applications of human population genetics more widely.
Session: VI.1 Room: Newman B Time: Friday 14:30-16:00
Session: V.3 Room: Newman D Time: Friday 11:30-13:00
Nicole Nelson
Middle-out Hierarchical Options in Causation
Conventionally, biologists represent the organisation of living organisms as a series of levels, with genes at the bottom and the organism as a whole at the top. In the reductionist mode, causality is seen to work primarily upwards, so that the organism is a development from its genes. Causality, however, also runs downwards. In fact, this kind of causality is more characteristic of living systems, since it depends on the integration of function at various levels. I will argue that in systems with feedback and downward causation there is no privileged level of causation. Modelling may therefore begin, with equal validity, at any level, including middle levels. It can then reach out, using multi-level engineering principles, towards lower and higher levels. I will illustrate these principles using models of the heart. Reference: Noble, D. (2006) The Music of Life (OUP).
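By way of a hedged illustration of this two-way causation (a textbook FitzHugh-Nagumo reduction of an excitable cell with illustrative parameters, not one of Noble's detailed heart models), the sketch below couples a cell-level voltage to a channel-level gating variable; the sustained rhythm arises only from the loop between the two, so neither level is causally privileged.

```python
# A toy illustration of two-way causation between levels, using a
# FitzHugh-Nagumo-style reduction of an excitable cell (not one of Noble's
# detailed cardiac models; parameter values are illustrative only).
# v: cell-level membrane potential; w: lower-level recovery/gating variable.
# v changes because of w (upward causation from channel gating to cell voltage),
# and w changes because of v (downward causation from cell voltage to gating).

def simulate(steps=40000, dt=0.01, I=0.5, a=0.7, b=0.8, eps=0.08):
    v, w = -1.0, 1.0
    trace = []
    for i in range(steps):
        dv = v - v ** 3 / 3.0 - w + I     # voltage driven by gating state w
        dw = eps * (v + a - b * w)        # gating driven by voltage v
        v += dt * dv
        w += dt * dw
        if i % 4000 == 0:
            trace.append(round(v, 3))     # sample the oscillating voltage
    return trace

# Sustained oscillations (a crude 'pacemaker' rhythm) arise only from the loop
# between the two levels; neither equation generates them on its own.
print(simulate())
```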
Denis Noble
Oxford University, Oxford, United Kingdom
Cornell University, Ithaca, NY, United States
Politicizing Methodology: Standardization Debates in Behavior Genetics
This paper will examine the ways in which methodological debates in behavior genetics are inflected with political and ethical concerns. The field of behavior genetics provides excellent opportunities for exploring the connections between science and politics, both because research in this field raises difficult questions about human identity and human equality and because behavior geneticists are particularly attuned to the potential consequences and uses of their research. Using a controversial report published in Science magazine as my starting point, I will explore how debates about standardization in behavior genetics are not only a problem of good science, but also a problem of good politics.
Session: XII.6 Room: Peter Chalk 1.5 Time: Sunday 9:00-10:30
The 1999 report was an attempt by three behavioral genetics researchers to determine whether the results of common experimental protocols could be replicated between laboratories. The researchers found that despite strenuous efforts to equate each laboratory site and experimental protocol, many of the results varied significantly between laboratories, and the researchers attributed this variation to effects of the “laboratory environment.” The report was widely discussed both in the scientific literature and the popular press. While the popular press characterized the finding as “one for nurture” in the grand struggle of nature against nurture and speculated openly on potential political consequences, the discussion of the report within the scientific literature characterized it as an apolitical problem of producing solid science.
I argue that these methodological debates can also be understood as political, and that political concerns are often framed in terms of the personal responsibility of the researcher not to draw hasty conclusions or to permit careless conceptualizations of the relationship between genes and behavior. The distinction between political and scientific concerns is therefore unclear: the need for caution in interpreting research findings and attention to environmental complexities that the report highlights is both good science for the laboratory and good politics for the public sphere.
Joao Nunes
University of Coimbra, Coimbra, Portugal
The Making Of A Pathogen:
The Early Biography Of Helicobacter Pylori
In 2005, Barry Marshall and Robin Warren were awarded the Nobel Prize for Medicine for their work on Helicobacter pylori (H.p.). This work was undertaken in the early 1980s, after Warren found bacteria in several sets of specimens of human gastric tissues obtained from biopsies. Despite repeated reports, since the late 19th century, of similar findings, in the 1950s a standard view of the causes of pathologies such as gastritis or peptic ulcers was established which excluded the possibility of bacterial infection. Bacteria-in-the-stomach were thus treated as non-existent entities, and the bacteria found in biopsies were dismissed as artefacts due to causes such as contamination of the specimens. The collaborative work undertaken by Warren, Marshall and others successively established the existence of bacteria in gastric tissues originating in human patients, the association of the presence of the bacterium with some pathologies of the gastric tract and the fulfilment of Koch’s postulates for the causal link between bacterial infection of the gastric mucosa and gastritis. The outcome was the robust demonstration of the existence of what at first was described as a strain of the bacterium Campylobacter, later recognized as a new
species christened Helicobacter pylori in 1989. Within a decade, H.p. was to become an obligatory passage point (Latour) in the field of gastroenterology. A new consensus over the aetiology of common gastric diseases emerged, and criteria were redefined for the description, classification, diagnosis and treatment of gastroduodenal pathologies. In 1994, H.p. was the first bacterium ever to be defined as a carcinogen. Over the following years, H.p. emerged as one of the most widespread infectious agents worldwide.
The discovery of H.p. was the object of an earlier interpretation by Paul Thagard, who framed it as a Kuhnian revolution in biomedicine. The path taken in this paper offers a different story. The paper is an exploration of the early episodes of the biography of the microorganism which was to be named Helicobacter pylori, between 1982 and 1985. It rests upon a close reading of the first-hand accounts – including the original published articles by Warren, Marshall and their co-workers and later first-person narratives by Warren and Marshall – of how bacteria in the gastric tract moved from an “impossible” entity to an epistemic object and to an established biomedical fact. In the process, new and strong associations were created between bacteria, tissues taken from diseased patients, symptoms and lesions identified through the mobilization of procedures from pathology, endoscopy, microbiology, electron microscopy and a controversial episode of self-experimentation by Marshall. The approach mobilized for this exploration draws on the tracing of what Ludwik Fleck (1935) described as active associations among heterogeneous entities and processes and the ensuing passive associations which allowed H.p. to emerge as a biomedical fact, and on the resources of actor-network theory.
Transgenics are complex and novel biotechnological systems whose impact on society and the environment is as yet unknown. To evaluate and understand this impact, it is necessary to know their nature and history in order to act and take informed decisions.
The mass media, the press in particular, have great power to inform about these issues and also to penetrate people’s lives. The media have played an important role in influencing public opinion about environmental matters. A great number of people obtain this kind of information from newspapers. Thus, the communication of scientific subjects has influenced the ideas that people have about science.
But the communication about transgenics in newspapers has been characterized by fragmented and confused information, without scientific basis and with low credibility; much of this coverage is isolated and event-centered and does not consider wider issues such as the context in which the events occur.
The aim of this talk is to use transgenic corn in Mexico as a case study in order to analyze a technological system and characterize the way in which different social actors perceive this kind of corn. This is done through the analysis of the debate that took place in Mexico between 2000 and 2004. We analyzed the coverage of 4 selected national newspapers throughout this period, as well as semi-structured interviews with different social actors involved in the case. According to the results, the coverage reflects a proliferation and misunderstanding of complex terms, particularly in the political and journalistic sectors. In general, such terms are not explained in a way that would support comprehension, and journalists often use them only to attract their readers’ interest.
Session: VII.6 Room: Peter Chalk 1.5 Time: Friday 16:30-18:00
Irama Núñez, Ana Barahona
UNAM, Mexico, D.F., Mexico
Session: V.6 Room: Peter Chalk 1.5 Time: Friday 11:30-13:00
Laura Nuño de la Rosa
Transgenic corn from the perspective of communication
Historical studies of technology have shown that in general there are three basic characteristics of all technological systems: first, motive or reason matters (there is intentionality, that is, the inventor’s intentions are incorporated into the structure of the object). Second, these motives or reasons are not determining: sometimes the objects are modified or redesigned for other, different purposes. And third, technological systems have unintended consequences. These characteristics of technological systems have urged the generation of knowledge related to the history, sociology and economics of science and technology in order to anticipate and prevent some of these consequences.
Complutense University of Madrid, Madrid, Spain
A reconstruction of the conceptual phylogeny of Pere Alberch within the tree of EvoDevo.
The study of the relationships between evolution and development underwent a long eclipse after the triumph of experimental embryology and the rise of population genetics. Pere Alberch belongs to the series of biologists who aimed to integrate development within an incomplete evolutionary synthesis. He was well aware of the long history of the analogical tradition. Indeed, one of his earliest papers was a quantitative elaboration of the research program defended by S. J. Gould in his historically grounded Ontogeny and Phylogeny. Since
then, von Baer, Agassiz, Haeckel, De Beer or Waddington became an openly acknowledged inspiration for his research.
The history of EvoDevo is not a linear but a branching one. I will sketch the conceptual tree of EvoDevo through the answers given to some basic questions concerning the definition, phenomenology, explanation and methodology of the study of development and evolution: What is development and what is evolution? What are the main developmental stages and how are they related to the history of life? What are the causes of both development and evolution and how are they connected? How should we approach all of these questions?
Concerning definition, Alberch defined development as a dynamical system and evolution as constrained by the nature of the former. On the phenomenological side he was an advocate of the discontinuity of organic change, both in ontogeny (e.g. considering some developmental events as bifurcations) and in phylogeny (defending the discrete character of morphospace). Regarding explanation, Alberch was a mechanist, conceiving development as a dynamical and interactive system (and focusing his research on the cellular and tissue aspects of pattern formation and morphogenesis). Finally, concerning methodology, Alberch was a great contributor to an integrative biology (being one of the pioneers in reconciling experimental embryology with phylogenetic research) as well as to the mathematization of biology (using tools developed in the study of complex systems).
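To make the bifurcation idea concrete, the following textbook toy (not one of Alberch's own models; the equation and parameter sweep are purely illustrative) shows how continuous variation of a single developmental parameter can produce a discontinuous jump in the state a dynamical system settles into, the kind of discreteness of outcome invoked above.

```python
# A textbook toy (not one of Alberch's models) illustrating how continuous
# variation of a developmental parameter r can produce a discontinuous jump
# in the developmental outcome: the stable state of dx/dt = r + x - x**3
# disappears at a fold bifurcation and the system jumps to a distant attractor.

def settle(x, r, dt=0.01, steps=20000):
    """Follow the dynamics dx/dt = r + x - x**3 to its local attractor."""
    for _ in range(steps):
        x += dt * (r + x - x ** 3)
    return x

x = -1.5                      # start on the lower branch of equilibria
for i in range(-8, 9):        # sweep the parameter continuously
    r = i / 10.0
    x = settle(x, r)          # track the attractor from the previous state
    print(f"r = {r:+.1f} -> developmental outcome x = {x:+.2f}")
# Around r ~ +0.4 the lower equilibrium vanishes and x jumps discontinuously
# to the upper branch: a small parameter change, a qualitatively new outcome.
```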
Starting from these answers to the posed questions, I will trace the phylogeny of Alberch’s ideas within the EvoDevo tree, finding his conceptual ancestors and distinguishing which of his contemporaries belonged to his species and which did not.
Time: Friday 9:30-11:00
Since then, a series of conferences were held, especially in the 1960s and 1970s, from which a new discipline, exobiology, arose. This gave rise to a brand-new scientific community focused on research into life beyond Earth. This scientific community was formed by physicists, chemists and engineers, especially those of SETI. Now, as Kuhn says, “a scientific community is composed by those who practice a scientific field, who are bound by common elements regarding their education and are responsible for the persistence of a shared set of objective goals including the training of its successors. To a great extent, the members of a certain community will have absorbed the same literature and will have obtained similar lessons out of it”. According to this, all research based on shared paradigms should be guided by the same rules and standards of scientific practice. However, Kuhn maintains that practitioners of widely separated fields may acquire rather different paradigms, depending on the courses they have followed, the textbooks they have read, etc. A paradigm may mean different things to different scientists; therefore, it can simultaneously determine several traditions of normal science that overlap without being coextensive. This is the main issue. Unquestionably, the paradigm that unified biology as a science differs implicitly from that of the scientists involved in the search for extraterrestrial signals. For these scientists, evolution is progressive, because they think that intelligent creatures can appear, after thousands of millions of years, thanks to evolution. For modern biologists, on the contrary, evolution does not imply the idea of progress or direction in any sense. When biology was born, in the 1950s, the formation of its scientific community assumed the implications of the theory of biological evolution. The builders of the synthetic theory of evolution – Simpson, Dobzhansky and Mayr – marked the beginning of a distinct point of view within exobiology. They discarded the possibility that organisms similar to human beings could have physically evolved, as they maintained that an evolution toward intelligence analogous to ours is extremely unlikely.
Time: Saturday 14:00-15:30
Session: IV.8
Room: Peter Chalk 2.6
Carlos Ochoa Olmos
National University of Mexico, Mexico City, Mexico
Search for Extraterrestrial Intelligence: a Kuhnian approach on the matter
The idea of a “search for extraterrestrial intelligence” (SETI) was developed in 1959, when Philip Morrison and Giuseppe Cocconi of Cornell University proposed a method to establish communication with extraterrestrial civilizations. At the same time, the physicist Frank Drake arrived at the same idea, but unlike them, he carried out the search with radio telescopes. An organization based on the possibility of interplanetary communication, known as SETI, emerged from this first approach to the search for extraterrestrial intelligence.
Session: X.5
Room: Peter Chalk 1.3
Jay Odenbaugh
Lewis and Clark College, Portland Oregon, United States
Robustness, Multiple Models, and Realism
In theoretical population biology, modelers often use what is sometimes called “robustness analysis” to evaluate features of their models. In this essay, first I argue that one fruitful way of thinking about robustness is as follows: a prediction of a model is robust just in case one can replace an assumption (or assumptions) with other assumptions which are logically independent of the first and show that the behavior of the dynamical system(s) with respect to that prediction remains the same, or close to it, across the independent assumptions. Second, I argue that in this sense, the search for robust predictions is an effective tool for relieving doubts about specific idealizations, since a robustness analysis can show that a given prediction does not depend on a particular idealization; moreover, this provides a rationale for exploring multiple models. Third, I argue that simply showing that a prediction is robust or invariant over a set of models need not be evidence for believing that prediction to be true – one still requires independent empirical testing of the model’s assumptions or predictions.
Time: Saturday 11:00-12:30
reproducing alone, coming together to form a group; therefore, there is the potential for selection to operate at both the individual and the group level during a transition. In any such multi-level scenario, it is important to consider the potential interaction between the two levels of selection. Additionally, it is important to ask whether one level of selection might ‘causally exclude’ another, i.e. whether selection at one level might generate, as an unintended side-effect, a character-fitness covariance at another level, higher or lower, and thus the appearance of direct selection at that level. I argue that such ‘cross-level byproducts’ are likely to be ubiquitous during the early stages of evolutionary transitions, when the ‘groups’ are still loose coalitions of interacting individuals. This is illustrated with reference to Michod’s models for the evolution of multi-cellularity. Finally, I show that the notion of a cross-level byproduct suggests a natural answer to the question of when a group of lower-level individuals constitutes a genuine evolutionary unit.
Session: IX.1
Room: Newman B
John Odling-Smee
School of Anthropology, University of Oxford, Oxford, United Kingdom
Niche Inheritance: Its Implications for Human Cultural Inheritance
The theory of niche construction adds a second general inheritance system, ecological inheritance, to evolution (Odling-Smee, et al., 2003). Ecological inheritance is the inheritance, via an external environment, of one or more natural selection pressures previously modified by niche-constructing organisms. This addition means descendant organisms inherit genes, and biotically transformed selection pressures in their environments, from their ancestors. The combined inheritance is called niche inheritance. Niche inheritance is used as a basis for classifying the plural, genetic and non-genetic inheritance systems in evolution proposed by Jablonka & Lamb (2005). Implications of niche inheritance for human cultural inheritance are discussed.
Time: Thursday 14:30-16:00
Evolutionary Transitions, Levels of Selection, and Cross-Level Byproducts
This paper examines a philosophical issue arising from the biological literature on evolutionary transitions. According to a widely held view, multi-level selection is crucial for understanding evolutionary transitions, for such transitions involve a number of free-living individuals, originally capable of surviving and
Session: II.3 Room: Newman D Time: Thursday 14:30-16:00
Session: II.2
Room: Newman C
Metagenomics and the proteorhodopsin case: Exploratory experimentation and its transformative effects
The current state of metagenomics, a subfield of molecular microbiology, readily lends itself to a diagnosis of exploratory experimentation. Through its deployment of a rapidly growing battery of techniques, interpretations and applications, metagenomic programmes of research are opening up vast new territories of microbiological investigation while simultaneously revising many of the most basic conceptual frameworks in microbiology. These revisions include the concept of species, the tenet of ageographical distribution (‘everything is everywhere’), and the fundamental ontological category of individual organism. For many observers, metagenomics merely refers to the sequencing of environmental DNA and its informatic analysis – different perhaps in quantity but not kind from ordinary single-organism microbial genomics, and similarly ‘mindless’. Although the practices covered by the label of metagenomics are diverse, most of them are not intended to be experimental in the sense of testing theoretically derived hypotheses. The metagenomic aim of investigating and systematically understanding currently uncharacterized microbial phenomena (entities and processes) means, however, that the description of ‘exploratory experimentation’ is a highly
Samir Okasha
University of Bristol, Bristol, United Kingdom
Maureen O’Malley
University of Exeter, Exeter, United Kingdom
apt one. In this presentation, I will first make a case for how metagenomics in general can best be understood as exploratory experimentation and how the field’s flexible techniques greatly extend the original scope of the research tools in unanticipated ways and sidestep some of the conceptual blockages imposed by more conventional microbiological research programmes. Second, I will show how exploratory experimentation has worked in one case of metagenomic analysis, the unexpected finding and subsequent investigation of proteorhodopsin genes in oceanic bacteria by Ed DeLong and colleagues. Finally, I will draw out the broader implications of this case and metagenomic inquiry for the notion of exploratory experimentation and how philosophers of biology might further develop it.
Session: VI.9 Room: Peter Chalk 2.5 Time: Friday 14:30-16:00
Session: III.3 Room: Newman D Time: Thursday 16:30-18:00
Cytology Textbooks, Multidisciplinarity, and the Making of the New Science of Aging in the United States, 1924-1945
The role of science textbooks in discipline-building has been discussed by several historians. Thomas Kuhn regarded textbooks as misleading because they contained only “the finished achievements” of research, which obscured the complex pathways of actual scientific development. Yet more recent historians, such as Andrew Warwick, David Kaiser, and others, have shown that textbooks have been a significant factor in constructing scientific disciplines and practices in various contexts, such as pedagogic imperatives, academic cultures, and larger socio-political constraints. In this paper, I also argue that textbooks, in particular multi-authored and edited volumes, are not so much a finished achievement as an ongoing project of constant interactions and negotiations. In particular, I trace the work of the Canadian-American cytologist Edmund Vincent Cowdry, who edited General Cytology (1924), Special Cytology (1928), and Problems of Ageing (1939). The first two textbooks were contributed to by many scholars in various branches of biology, without any single unifying view. As Jane Maienschein has pointed out, this reflected not so much a weakness of Cowdry’s editorship as the contemporary American biologists’ ideal of cooperation among the various specialists, which indeed represented the research at the Marine Biological Laboratory at Woods Hole. Cowdry edited the first handbook on aging research, Problems of Ageing, in a similar way. Rather than focusing on a single topic, he tried to include in his handbook researchers from diverse fields in the life sciences, such as cytology, physiology, pharmacology, botany, and animal husbandry. As he had done in editing the cytology textbooks, he encouraged cooperation among these scholars by sending each author’s chapter to others with similar interests. This led to further discussions, helpful exchanges of comments, and sometimes to debates. Through this process, the contributors to the book came to feel that their cooperation would help society find solutions to the “problems of aging” at a time when the Great Depression compelled a rethinking of age, as well as gender and race, and eventually led to the making of the Social Security Act. While remaining specialists in their own disciplines, these contributors began to think that they belonged to another, new field, gerontology, consisting of scientists from various fields. I will show that they founded the Gerontological Society
Lisa Onaga
Cornell University, Ithaca, NY, United States
Silkworm Breeding and the Development of Genetics in Meiji Japan
An examination of silkworm breeding in Japan reveals a critical yet understudied aspect of Japanese experimental biology: the relationship between silk manufacture and the emergence of modern Japanese genetics. The silkworm was an exemplary research organism and also became an object of contestation as craftspeople, industrialists, and scientists sought to improve the qualities of spun silk bound for Europe and America during the reign of the Emperor Meiji (1868-1912). Sericulture, the rearing of silkworms, is on the one hand associated with craft knowledge held by agricultural families. Yet it also concerns the work of scientists such as Toyama Kametaro, whose selective breeding studies in the early 1900s helped develop expertise in the production of commercially valuable F1 hybrid silkworms. Much of the technical work of agricultural researchers in Japan such as Toyama suggested that Mendelian genetics was at play in silkworms. Considering the backdrop of societal changes that took place in Japan in the decades following the end of its policy of national isolation, this paper will focus on how sericultural knowledge transformed as commercial silk production gathered pace at the turn of the twentieth century. The emphasis placed upon silkworms by Japanese researchers points to an agricultural basis for the growth of biology in Japan, as well as to questions concerning the role of biological research in signifying a change in the ownership of sericultural knowledge at the time.
Hyung Wook Park
The University of Minnesota, Minneapolis, Minnesota, United States
in 1945, which promoted gerontology as a multidisciplinary science. In this sense, Cowdry’s textbooks and handbook functioned as a “boundary object,” which, as Ilana Löwy and Susan Leigh Star have shown, enables different disciplines to talk to one another.
map. A linear genotype-phenotype map is simply assumed by Evolutionary Psychologists; furthermore, their essentialist perspective obviates a multilevel analysis of dynamic phenomena.
Evolutionary Psychology is committed to the existence of psychological mechanisms in the mind which evolved by natural selection (Cosmides and Tooby, 1992; Buss, 2004). One methodological problem of Evolutionary Psychology, according to Lloyd, is to show, in an empirically adequate manner, that a trait is an adaptation. For instance, in the case of sexual selection, beauty is considered an important adaptive trait for mate selection, with female youth, fertility and health serving as signals of beauty (Buss, 1994; Thornhill, 2003). In the mate selection process, more beauty (i.e., more youth, fertility and health) is thus strongly positively correlated with the reproductive success of the female. Furthermore, Oyama’s DST/DSP analysis sheds light on a second type of problem with these explanations: they simply assume that evolution consists solely (and reducibly) of changes in gene frequencies, with genes causally corresponding to phenotypes in a fairly simple one-to-one manner. In the process of mate selection these characteristics indicate the presence of direct fitness benefits that enhance the reproductive success of the selecting mate.
In my talk, I present a bibliographical review of the role of beauty in mate selection. Starting from the methodological injunction to avoid “bad science,” I employ Lloyd’s framework to assess when, if ever, evolutionary psychologists have shown, in an empirically adequate manner, that beauty is an adaptation for mate selection. I use Oyama’s work to move beyond a gene-centered evolutionary psychological approach in “the interest of the eventual synthesis of a complex, multilevel reality” (Oyama, 2000, p. 137).
Room: Peter Chalk 1.1
From transmission to plasticity: the changing concept of heredity since the middle of the twentieth century
The paper will explore two central concepts in modern genetics: heredity and plasticity. It will be argued that the relation between these two notions is historically more intricate than usually assumed. Heredity is usually equated with trans-generational transmission processes, whereas plasticity is considered a developmental matter, only indirectly related to DNA-based heredity. This paper will argue for the instability of the concept of heredity itself. It will be claimed that the transmission-based view of heredity had already reached its zenith by the middle of the twentieth century. Since the 1950s, the concept has been constantly modified to incorporate more and more phenomena which are essentially non-transmissional. From another perspective, I will argue for a more general trend in the biomedical sciences, from the understanding of life processes in terms of specific agents and their effects, to a view of the organism as a reactive and plastic system. A historically parallel development can be observed in fields as different as immunology and neurobiology.
Time: Friday 11:30-13:00
Zentrum für Literatur- und Kulturforschung, Berlin, Germany
Session: V.4
Ohad Parnes
Yuridtizi Pascacio-Montijo
Instituto de Investigaciones Filosóficas, Mexico City, Mexico
The New Mechanistic Philosophy and the Mechanism of Competition
I explore whether competition, a putative ecological mechanism, can be adequately characterized from the point of view of the ‘new mechanistic philosophy’. At the forefront of this philosophy are the conceptions of Glennan, of the team of Machamer, Darden and Craver, and of Bechtel. These authors think of scientific explanation in terms of discovery of mechanisms that produce phenomena. And their conceptions of mechanisms are meant to make sense of a great
Beauty, Mate Selection and Evolutionary Psychology: A Critical Review
I analyze the proposals of Elisabeth Lloyd and Susan Oyama regarding the explanatory role of evolutionary theory in the discipline of Evolutionary Psychology. Lloyd in The Case of the Female Orgasm suggests the operation of two main biases in Evolutionary Psychology: adaptationism and androcentrism. Oyama in Evolution’s Eye argues that a central problem concerns how we understand the genotype-phenotype
Session: IX.2 Room: Newman C Time: Saturday 11:00-12:30
Viorel Pâslaru1,2
Session: VIII.9 Room: Peter Chalk 2.5 Time: Saturday 9:00-10:30
1) University of Cincinnati, Cincinnati, OH, United States, 2) Romanian Academy, Bucharest, Romania
number of cases of scientific explanation. I first examine the differences and similarities in the foregoing accounts regarding the explanatory role of mechanisms, their components, organization, and constitutive causal relationships. Then, I consider the experimental investigation of competition due to Gause, who speaks of a mechanism of competition. After scrutinizing the aforementioned conceptions of mechanism, I identify elements of the mechanism of competition in Gause’s explanation of this ecological phenomenon. I show that the conceptions of mechanism appropriately characterize some aspects of the explanatory role and of the componential aspect of the mechanism of competition, but miss most of its organizational and causal facets. Accordingly, I suggest a way of thinking about the mechanism of competition in terms of invariant and insensitive causal relationships of substance transfer.
Time: Thursday 16:30-18:00
Mums as the Measure of Men: Global Plant Culture in the Nineteenth Century
Plant improvement seemed a typical element of the Industrial Revolution. With a new international bourgeois market and with increased access to American and Asian raw materials (germplasm), amateurs and nurserymen in western Europe and the U.S. produced endless novelty—new and improved plants—that demonstrated the superiority of western culture. In the last third of the century, however, the spread of Japanese plants and gardens raised significant questions about that self-satisfied narrative.
Session: VII.9 Room: Peter Chalk 2.5 Time: Friday 16:30-18:00
Albert Peacock
Florida State University, Tallahassee, FL, United States
Session: III.3 Room: Newman D
Vestiges of the Natural History of Creation in America: A Quick Response in the Years 1844-1847
Milton Millhauser, in one of the first texts discussing the impact of Vestiges of Creation, states that more editions of Robert Chambers’ anonymously written book circulated in the United States than in Britain. Yet although more Americans owned the text, very little scholarship has been done on the work’s impact in the United States. In this paper I will attempt to discuss and make some generalizations about the direct impact of Vestiges and its sequel, Explanations, in the years directly after publication. The main argument will be drawn from periodical publications of the period, some of which, I believe, have not been mentioned by the few recent scholars working on this subject. I will discuss why many publications recommended the reading of Vestiges to their readers, while others dismissed the book entirely. An argument will also be made about the cultural reaction to Vestiges in America. It became very popular in public discussion to try to guess who the anonymous author of Vestiges of Creation was, and to attend public lectures on the subject, such as those of the physician John Augustine Smith. It is for these reasons that my study of Vestiges of Creation will highlight not only the book’s intellectual impact in America, but also the social activity caused by its publication.
Philip J. Pauly
Rutgers University, New Brunswick, NJ, United States
This thesis is drawn from examination of the movements of a number of different plant types; I will emphasize roses and chrysanthemums to convey the contrast between my perspective and that of Michael Adas’s well-known Machines as the Measure of Men. The paper as a whole is an effort to expand the subject matter of the history of biology.
Trevor Pearce
Session: III.7 Room: Peter Chalk 1.6 Time: Thursday 16:30-18:00
University of Chicago, Chicago, IL, United States
The Scorpion’s Sting – Functions, Mechanisms, and Biomechanical Explanation
Why are scorpion stingers 25% zinc? Why don’t tiny marine organisms swim like fish? These are just two examples of the kinds of questions asked by scientists working in biomechanics, a subfield of biology in which engineering techniques are used to investigate organisms and their environments.
Given that biomechanics studies “mechanical design in organisms,” as the title of one textbook has it, it is somewhat surprising that the so-called “new mechanistic philosophy” in the philosophy of science has paid it little notice. The central focus of this literature to date has been
microbiological, e.g. cell biology or neuroscience (Bechtel 2006, Darden 2006, Craver 2007). The goal of this paper is to investigate how mechanisms and functions operate at the macrobiological level in the field of biomechanics, and what role they play in biomechanical explanations.
After a brief overview of previous work on function, mechanism, and explanation, I will present a model of biomechanical explanation based on a case study of metal fortification in scorpion stinger cuticle. The model separates mechanisms into two classes: those leading up to the fact to be explained (upstream) and those following from the fact to be explained (downstream). I will argue that functional explanations are best seen as involving downstream mechanisms, whereas evolutionary explanations are best seen as involving upstream mechanisms.
Despite the claims of some philosophers, scientific explanations cannot be equated with detailed descriptions of mechanisms – in fact, genuine functional explanations can be presented even if downstream mechanisms are unknown, poorly described, or simply assumed. Moreover, discovering the function of some structure or behaviour does not itself constitute an evolutionary explanation; rather, such explanations involve comparative phylogeny and the elucidation of upstream mechanisms.
Philosophers of biology have a responsibility to investigate as many different biological research programs as possible before making claims about scientific concepts and methodology; this paper argues that biomechanics is a good place to start.
millennium has seen the emergence of a biological postgenomic era in which functional genomics, proteomics and interactomics have become very important in biological practice. We will try to determine whether this new postgenomic era can produce a new dominant biological paradigm. For this purpose, we will underline the increasing use of new concepts and the return of old concepts in the theoretical universe of biologists. Furthermore, new kinds of experimental approaches to living beings are being developed. All of these changes seem to account for internal and external systemic dimensions of living beings. However, such approaches are only possible in a postgenomic era because they are built from genomic data. Is a new systemic paradigm appearing and signalling the end of the molecular paradigm? We will try to answer this question from a philosophical and historical perspective. Behavioural genetics is our preferred model for studying the changes under way. Indeed, this research domain is characterized by a strong confidence in the explanatory power of genes, and we will question the future of behavioural genetics in a postgenomic era.
Session: IX.7 Room: Peter Chalk 1.6 Time: Saturday 11:00-12:30
Kornblith on Knowledge: Reliability, then or now?
In Knowledge and its Place in Nature, Hilary Kornblith presents an account of knowledge as a natural kind, one that both humans and non-human animals share. Kornblith argues that knowledge is true belief produced by mechanisms that were reliable in the environment of evolutionary adaptation. In this paper I argue that Kornblith’s account fails to be an interesting account of knowledge, because the concept of reliability that is of importance to us is one that pertains to the modern environment. I argue by showing that while a particular belief-forming mechanism may have been reliable in the environment of evolutionary adaptation, a belief produced by such a mechanism should not count as an instance of knowledge, as it is just a matter of luck that the individual in question has a true belief.
First I argue that, in the environment of evolutionary adaptation, individuals had a disposition to form beliefs, in the right circumstances, that out-group members are dangerous or likely to be more aggressive. Research seems to indicate that it is plausible that xenophobia in humans was selected for and beneficial to survival. In addition, it seems likely that a mechanism producing beliefs of the sort “out-group members are hostile or dangerous” would have produced true beliefs more often than not in the environment of evolutionary adaptation.
Laurence Perbal
Free University of Brussels, Brussels, Belgium
The Postgenomic Era and a New Systemic Paradigm in Biology?
In this talk, we will try to clarify, from a philosophical and historical perspective, the possible paradigmatic revolution that biology is currently undergoing. The end of the 20th century has been called the genomic era of biology, a period characterized by researchers’ very strong confidence in the explanatory power of genes. Among the consequences of this confidence were intense genome-sequencing activity and a conceptual universe full of genes and genetic programs. This was the reign of the molecular paradigm. However, because of the advancement of research and the (re)discovery of ever greater molecular complexity, the concepts and experimental approaches of the genomic era are being questioned. The beginning of this new
Session: XIII.10 Room: Peter Chalk 2.3 Time: Sunday 11:00-12:30
Heather Perez
Florida State University, Tallahassee, Fl, United States
The second half of the argument is concerned with the modern environment and evidence from social psychology. I argue that in the modern environment, individuals retain the mechanism discussed above. Experiments have shown that out-group members are likely to be viewed as aggressive or dangerous. However, unlike in the environment of evolutionary adaptation, the mechanism is not generally reliable in the modern environment. Therefore, when one has the true belief “that particular out-group member is dangerous”, it is just a matter of luck and should not be considered knowledge.
those that adopt an engineer’s way of analyzing the performance of a machine and apply “analogous approaches” to the study of protein machines.
We set out to establish whether the molecular machine analogy actually plays such a role in current molecular biological research. Empirical data regarding usage patterns of the analogy in the literature and at conferences, and an astounding multitude of definitions of “machine”, suggest that rather than guiding practical research as a model, the analogy is used metaphorically and functions on a social level by helping to define a subset within the molecular biological community.
Session: XI.6 Room: Peter Chalk 1.5 Time: Saturday 16:00-17:30
Winfried Peters, Suin Roberts, Bernd Buldt
Indiana/Purdue University Fort Wayne, Fort Wayne, IN, United States
Session: V.10 Room: Peter Chalk 2.3 Time: Friday 11:30-13:00
University of Maryland, Baltimore County, Baltimore, MD, United States
Selection vs. Drift: Apportioning Causal Responsibility
This paper considers whether and to what extent we might apportion causal responsibility between selection and drift. Elsewhere I have argued that, if the probabilities involved in natural selection result in part from abstracting from or ignoring certain features of the environment, then selection and drift might be understood as causally distinct. Drift is caused (in part) by the unequal distribution of ignored factors, while selection occurs due (in part) to those features that are not ignored. In this paper, I develop this view further. I argue that the factors that ought to be ignored or abstracted from are those features of the environment that would not make a difference to differential survival or reproduction were they equally distributed across the competing types in the population. However, when the ignored factors are not equally distributed across competing types, drift can result in evolutionary change. I consider whether and to what extent this view allows us to distinguish how much of evolution is due to selection versus how much is due to the causal factors associated with drift.
Molecular Machines – A Metaphor in the Making
Machines have been utilized as sources of explanation by analogy ever since biological research adopted an experimental approach in the modern sense. The complete dominance of explanations of organismic functions through analogizing organisms with mechanical machines in the works of BORELLI, HARVEY, and HALES in the 17th and 18th centuries is a case in point. Although the meaning of the term machine in its Western cultural context has changed substantially since then, as the prototypical machine evolved from levers and screws through steam engines to TV sets and automobiles, and though its connotations may have differed significantly in different languages at different times, there can be no doubt that modern biology has followed a technomimetic mode of reasoning throughout its history.
In 1998, the then president of the National Academy of Sciences (USA), B. ALBERTS, called on the molecular and cell biological community to “prepare the next generation of molecular biologists” to view “the cell as a collection of protein machines” (Cell 92, 291-4). Subsequently, the frequency of terms such as protein machine in the pertinent journals increased year after year (as recorded in literature databases), and Molecular Machines has become a common title for conferences and sessions (e.g., at this year’s biggest molecular biological meeting in Europe). But what exactly is the nature of this emerging machine analogy in molecular bioscience? ALBERTS called (intentionally or not) for the establishment of a new Kuhnian paradigm when he stated: “prototype investigations that are clearly explained and reexplained in review articles and textbooks can help to shape this exciting new field and to recruit young scientists” (loc. cit.). Suitable prototype investigations would be
Session: VII.10 Room: Peter Chalk 2.3 Time: Friday 16:30-18:00
Jessica Pfeifer
Jérôme Pierrel
IRIST-LESVS, Université Louis Pasteur, Strasbourg, France
Sequencing RNA In The 1960s And 1970s: An “RNA World”?
In the decade before DNA sequencing was feasible, scientists around the world began RNA sequencing. This
era began in the 1960s with a 77-nucleotide-long tRNA (Holley et al., 1965). It ended in 1976 with the first complete genome, the 3569-nucleotide-long RNA virus MS2. One year later, the 5375 nucleotides of the DNA virus phiX174 were completed in Cambridge. From then on, most of the sequencing effort was devoted to DNA. The historiography of molecular biology has often either depicted the heroic period of the phage group or analysed the recent development of DNA genomics. I depict what might be called an “RNA world”, which may be compared with the “protein world” of the 1950s and with the subsequent DNA genomics, so as to shed light on the transformations of molecular biology – from the bench to the genome plant – since World War II.
This paper describes the sequencing projects of two biochemists turned molecular biologists: MS2 in Walter Fiers’ lab in Ghent, Belgium, and transfer and ribosomal RNA in Jean-Pierre Ebel’s lab in Strasbourg, France. These projects relied on methods inherited from the “protein world”, such as paper chromatography, but enzymatic methods reached considerable refinement with RNA and DNA sequencing. RNA and DNA specialists borrowed their enzyme and fractionation “tricks” from each other. Nevertheless, RNA sequencing had a life of its own. In contrast to protein and DNA sequencing, there was neither automated sequencing nor computer use in RNA efforts. Unlike DNA projects, each RNA was a specific problem. This led to the establishment of specific technical and social devices to unravel each structure. Thus, I argue that these long-lasting efforts (over about ten years) were both tinkering and engineering, both innovation and routine. As such, they represented an essential intermediary phase in the development of the molecular biology of the second half of the twentieth century.
and, perhaps more quietly, ecology. Partly because of the so-called “hardening” of the Synthesis (i.e., an emphasis on natural selection as the primary motor of evolutionary change to the exclusion of other possibilities), and partly because of the concomitant spectacular progress of molecular biology, phenotypes – i.e., forms – gradually disappeared from center stage. By the 1960s and ‘70s the MS had become a theory of population and molecular genetics, retreating from, or dismissing as already part of its framework, problems such as the evolution of phenotypic novelties. A parallel intellectual tradition therefore emerged to cater to those biologists who were increasingly dissatisfied with the MS, a tradition that focused on phenotypes and their changes through time, and which found some of its most vocal (and often controversial) exponents in figures like Goldschmidt, Schmalhausen, and Waddington, to mention a few. Their concepts, respectively, of hopeful monsters, stabilizing selection, and genetic assimilation became at once well known and almost entirely ignored by practicing biologists until the second part of the 1980s. It was at that point that several researchers of a new generation (e.g., Stearns, Schlichting, West-Eberhard) began to use the concept of phenotypic plasticity – the property of a given genotype to produce different phenotypes in different environments – as the centerpiece for the renaissance of the phenotype. Within the span of two decades plasticity went from a nuisance to be avoided at all costs to a key word in evolutionary theory, and the empirical as well as theoretical research on plasticity succeeded not only in reviving interest in the role of ecology in evolutionary theory, but also in resurrecting and updating Waddington’s and Schmalhausen’s (though not Goldschmidt’s) work. At the same time, of course, the evo-devo approach was developing quite independently, with the goal of finally including developmental biology in the MS framework. We are now at an interesting crossroads where the Modern Synthesis is still the dominant paradigm in evolutionary biology, with evo-devo and plasticity studies attempting to expand the conceptual and empirical framework of the Synthesis to finally include developmental biology and ecology. It is an open question whether these efforts will result in a significantly augmented evolutionary theory – enough to merit a new post-MS term to identify it – and, more importantly, whether the MS/evo-devo/plasticity trident will be sufficient to finally pin down difficult problems such as the evolution of phenotypic novelties.
Session: I.2 Room: Newman C Time: Thursday 11:30-13:00
Massimo Pigliucci
SUNY-Stony Brook, Stony Brook, NY, United States
The (almost) forgotten phenotype
Evolutionary biology started out as a theory of form. Darwin’s central problem was to explain the diversity of biological organisms using two fundamental principles: common descent and natural selection. It could hardly have been otherwise, considering that genetics, let alone molecular biology, was not yet on the horizon. The Modern Synthesis (MS) of the 1930s and 1940s, as is well known, pursued the unification of natural history (à la Darwin) and the emerging disciplines of Mendelian, population and quantitative genetics. It largely succeeded, though at the price of leaving out – notoriously – embryology and developmental biology
Session: IV.1 Room: Newman B Time: Friday 9:30-11:00
Session: XI.11 Room: Peter Chalk 2.1 Time: Saturday 16:00-17:30
Monika Piotrowska
Tina Piper
University of Utah, Salt Lake City, Utah, United States
McGill University, Montreal, Canada
What Does It Mean To Be 75% Pumpkin: The Units Of Comparative Genomics
Since the advent of comparative genomics, the media has reported several puzzling claims about genetic similarities between various species: for example, that the genome of a mouse, as well as that of a pumpkin, is 75 percent similar to that of a human; that the wheat genome is five times larger than ours; or that even the genomes of similar organisms, like crickets and fruit flies, can differ greatly in size. Reading about such perplexing discoveries can either lead us to question the central role genomes play in phenotype development or lead us to question our understanding of the genotype-phenotype relationship. In other words, if a person notices the various phenotypic differences between a mouse and a pumpkin, but later reads that humans are 75 percent similar to both, she may either conclude that genes have a slim hold on phenotypic differences, or that the comparative genomic studies did not successfully identify all the differences between the two genomes. In this paper I argue for the latter conclusion so that the former can be avoided. I do this, in part, by criticizing the narrow gene concept used by comparative genomics to guide genome alignment. While I agree with John Dupré that genomics is the successor science to traditional genetics (Dupré 2004), I believe that comparative genomic studies do not capture enough aspects of genomic complexity. One reason for this is that their computational results are produced in the absence of an appropriately sophisticated gene concept. Although the relevant similarity statistics are being reported by the media and not by the geneticists themselves, I believe these reports are symptomatic of the conceptual confusion present in the field. Furthermore, comparing species at the level of genomes, and citing their overall similarity as a percentage, may lead to the false conclusion that quantitative similarities and differences at the genetic level correspond to qualitative similarities and differences at the phenotypic level.
REFERENCES:
Dupré, John. 2004. Understanding Contemporary Genomics. Perspectives on Science 12(3).
An Ongoing Dialogue: Understanding Life Sciences through the Lens of Patent Law in the Early Twentieth Century
The history of patenting in the life sciences highlights the interplay between understandings of property and biology, interwoven with a preoccupation with professional self-determination by a range of actors.
Patent law is often the first attempt at formal legal regulation of an innovation in the life sciences. As a site of initial conflict it is a fertile entry-point to an inquiry into shared understandings of biology, property and life, both for what is and is not discussed. Of necessity, patent law requires that lawyers, doctors, scientists and policy-makers enter into a dialogue. A particularly striking example is that of medical methods, which were formally excluded from patentability in the early twentieth century. Historical research in medical and legal archives, primarily from the period 1890-1940, suggests that the definition of a medically unpatentable innovation was contingent on a social understanding of what lay within a medical doctor’s professional work. That legal boundary-drawing juxtaposed the norms of the medical profession against those of the scientific and legal communities, as well as directly contrasting medical, scientific and legal understandings of biological innovation. The transatlantic nature of the scientific and medical professional communities and the domestic legal context further nuance this portrait. This paper will discuss what this dialogue can tell us about the law’s role in constructing an understanding of biomedical property in the early twentieth century.
Session: I.9 Room: Peter Chalk 2.5 Time: Thursday 11:30-13:00
W. Brad Pitts1, Gregory J. Morgan2
1) University of South Alabama College of Medicine, Mobile, AL, United States, 2) Spring Hill College, Mobile, AL, United States
Evolution without Species: The Case of Mosaic Bacteriophages
Recent work in viral genomics has shown that bacteriophages exhibit a high degree of genetic mosaicism, which is most likely due to a long history of prolific horizontal gene transfer (HGT). We argue that each of the most plausible attempts to classify bacteriophages into distinct species fail. Mayr’s biological species concept fails because there is no useful viral analog to sexual reproduction. Phenetic species
concepts fail because they obscure the mosaicism and the rich reticulated viral histories. Phylogenetic species concepts, even when extended to take into account reticulation, fail because there is no non-arbitrary distinction between recombination events that create a new viral species and those that do not. Bacteriophages, arguably the Earth’s most abundant biological agent, evolve without forming species.
Time: Thursday 14:30-16:00
Behavioral Genetics and the Shared/Nonshared Environment Distinction: How (Not) to Interpret Behavioral Genetic Research
Using the concepts of shared and nonshared environment, behavioral geneticists have generated findings about environmental influences on phenotypic outcomes. Such findings have been reported in popular literature and interpreted by some as support for the conclusion that “parents don’t matter.” In this talk, I reexamine this conclusion and argue that it is based on an equivocation between two different senses of ‘shared environment’: the technical conception employed by behavioral geneticists and a more intuitive notion that is often used in popular accounts. Behavioral genetic findings do not lead to the conclusion that parents don’t matter, but rather only that parents don’t have the kind of influence on their children that some traditional psychological theories may have expected. Identifying this misinterpretation in the popular literature, and the reason for it, suggests a need for philosophical analysis of the environmental concepts in behavioral genetics, as well as a role for philosophers at the intersection of science and society.
inspired by optimality explanations. First, reductionism misconstrues the relationship among competing explanations. Second, optimality explanations illustrate that there are principled reasons for different types of explanation for a single phenomenon, and that some explanations are best formulated at high levels of description. However, the extent of the anti-reductionist position suggested by optimality explanations is limited. Though optimality explanations need not include genetic information, they are nonetheless dependent on this type of information. I will briefly indicate the nature of this dependence. The position suggested by optimality modeling is, thus, intermediate between the poles of reductionism and radical disunity of science.
Session: II.5 Room: Peter Chalk 1.3
Kathryn Plaisance
Leibniz University of Hannover, Hannover, Germany
Session: X.6 Room: Peter Chalk 1.5 Time: Saturday 14:00-15:30
Representation and invention: animate embodiments
Biological and biotechnological inventions pose a specific problem for patent law. What counts as ‘disclosure’ of a plant variety, a nucleotide sequence, a receptor, or a metabolic mechanism of action? In patent doctrine, criteria of disclosure are said to follow from the basic normative premise of the ‘patent bargain’. According to this justificatory theory, the grant of a patent is conditional on a full disclosure of the knowledge embodied in the invention. The patent or the patent application should do enough ‘to put the public in possession of the invention’; it should enable those skilled in the art to make and use the invention after the expiry of the patent term, and in the meantime to make use of the knowledge contained in the patent text. In the age of mechanical inventions it was apparently easier to offer an adequate description of a patented artefact; that is, a description that married structural and functional descriptions in such a way as to make the invention (re)cognisable and reproducible to other artisans. Biological inventions have never quite fitted into this model of description. Throughout the 20th century one of the most consistent objections to the patenting of micro-organisms and plant varieties was not the ethical argument that they were living ‘products of nature’ but the more pragmatic complaint that they could not be adequately defined and distinguished. The introduction of patents for asexually-reproduced plant varieties eased some of these concerns, and ever since it has been assumed that biological inventions can be described by a deposit of living materials rather than by text. What are the
Angela Potochnik
Stanford University, Stanford, CA, United States
Optimality Explanation as Anti-Reductionism
I have argued (Potochnik, forthcoming) that in some circumstances, optimality models best explain long-term evolution. This is despite the fact that these models do not provide information about genetic transmission. This result is opposed to what would be expected were reductionism a general goal of explanation. Reductionism suggests that explanations should be maximally inclusive and should describe the system at the lowest level possible. Optimality explanations have neither of these attributes. In this presentation, I will explore some criticisms of reductionism that are
Session: X.11 Room: Peter Chalk 2.1 Time: Saturday 14:00-15:30
Alain Pottage1, Brad Sherman2
1) London School of Economics and Political Science, London, United Kingdom, 2) University of Queensland, Brisbane, Australia
implications of recruiting biological processes as a means of representing and conveying knowledge? How does the consistency of the intangible change in the shift from text to material?
outcomes that are stably replicated in that lineage” (Griffiths and Gray 1994, page 278). Therefore, the biological individual is the DS, not the organism.
Session: I.3 Room: Newman D Time: Thursday 11:30-13:00
notion of the organism? Dawkins (1982) tries to do so in an evolutionary context: genes, and not organisms, would be selected by natural selection. But is it possible to get rid of the notion of organism when one aims, as developmental systems theorists do, to account for physiological, and especially developmental, aspects of living things?
Alexander Powell
ESRC Centre for Genomics in Society, Exeter, United Kingdom
Emergence, causation and levels in biological systems
Despite the far-reaching accomplishments of molecular biology in elucidating numerous aspects of cellular and organismic biology, considerable doubt exists in the minds of many biologists as to whether complete knowledge of the parts of a biological system will suffice to explain the behaviour and properties of the whole. Several decades of intense development in scientific computing, and the development and assimilation of ideas from complexity studies and elsewhere, mean that researchers now have substantial resources with which to develop scientific responses to their doubts about molecular determinism. Already we see an increasing interest in the collective, in vivo and interactional aspects of biomolecular phenomena, and experimental techniques are being found to supplement in silico work. In addition there are a number of long-standing scientific lineages to build upon. But some argue that new theoretical paradigms will be needed in order to address the deepest biological puzzles. The concept of emergence looks set to be associated with systems thinking just as reductionism has come to be associated with molecular biology. I will outline some of the philosophical connections between emergence, causation and reduction, paying particular attention to the notions of mechanism and levels. I will argue that if the concept of emergence has a part to play it will be in helping us to develop a coherent and inclusive account of the causal possibilities of complex systems.
Time: Saturday 11:00-12:30
2. What does one gain by switching from the organism to the DS? In particular, are the boundaries of a DS defined more precisely than that of an organism? Susan Oyama (2003) has a very open view on this question: she argues that it is difficult to delineate both a DS and an organism, and suggests that rather than trying to delineate either one once and for all, one should consider what the theoretical or empirical task is.
This view raises at least three difficulties:
1. Is it conceivable that biology could do away with the
3. The proponents of DST see Richard Lewontin as an important forerunner. Yet Lewontin asserts the existence of a dialectical relationship of co-construction between the organism and the environment, and he therefore considers that the concept of organism should be central in current biology (Lewontin 2000). How is it possible to hold both the co-construction thesis and the view that the notion of organism should be abandoned?
Session: IX.3 Room: Newman D
Pontifícia Universidade Católica de São Paulo (PUC- SP), São Paulo, São Paulo, Brazil
Thomas Pradeu
IHPST, Paris, France
The Emergence of Themes of Research in the Epistolary Relationship Between Lazzaro Spallanzani and Charles Bonnet
We usually read in the old historiography that the importance of the Italian naturalist Lazzaro Spallanzani is due to his ability in making observations and experiments. According to this interpretation, Spallanzani would have merely repeated what
DST theorists aim at reconceptualizing evolutionary theory and some of its basic notions, including that of the organism (Griffiths and Gray 1994). According to Griffiths and Gray, from an evolutionary point of view, the concept of organism should be replaced by a larger category, that of the “developmental system” (DS), which refers to “the resources that produce the developmental
REFERENCES:
Dawkins, R. (1982) The Extended Phenotype, Oxford University Press.
Griffiths, P. and Gray, R. (1994) Developmental Systems and Evolutionary Explanation, The Journal of Philosophy.
Lewontin, R. (2000) The Triple Helix: Gene, Organism, and Environment, Harvard University Press.
Oyama, S. (2003) Boundaries and (Constructive) Interaction, in Neumann-Held, E. and Rehmann-Sutter, C. (eds), Genes in Development: Re-reading the Molecular Paradigm, Duke University Press.
Session: XIII.11 Room: Peter Chalk 2.1 Time: Sunday 11:00-12:30
Maria Elice Brzezinski Prestes
was suggested or already done by other scholars. However, in the last two decades a greater intellectual and experimental autonomy has been defended for his research program. In this paper, this controversy will be analysed through the epistolary dialogue between Spallanzani and Charles Bonnet – who is often pointed to as his intellectual guide. In this sense, the 26 years of correspondence between them (from 1761 to 1791) can give us some parameters for understanding the origins of the themes and problems of his research, as well as the procedures of observation and experiment developed by the Italian naturalist.
follows Buffon’s program of investigating the real genealogy of living forms. In his second intervention, the methodology for teleological judgments of the Kritik der Urteilskraft (foreshadowed in his 1788 paper against Forster), Kant seems to have quite different aims. The question is now to what extent a teleological perspective can be justified in natural science (broadly understood), and Kant’s solution is roughly that it is justified as a means of grasping organized beings as such and investigating their properties, provided that we restrict the validity of teleology and do not ascribe it in an absolute sense to the beings in question. Kant also warns against the idea of explaining organization out of non-organized matter, criticizes the grand proto-evolutionary idea of an “archaeology of nature”, and in general seems rather skeptical about the possibilities of a genealogical study of natural forms.
As Phillip Sloan and others have argued, Kant seems to have changed his mind about the possibility and desirability of a genealogical approach to the life sciences. Connected to this issue is the question of whether the Kritik der Urteilskraft contains a positive methodology for biology, or whether it is rather a negative program for restricting its ambitions and preventing it from taking the genealogical turn.
In this paper I will claim that there is both a positive and a negative part in Kant’s program, and that also in the Kritik der Urteilskraft there is room for Naturgeschichte, even though Kant is undoubtedly more skeptical in this phase about the possibility of extending it very far. I will also look at the example of Christoph Girtanner’s Über das Kantische Prinzip für die Naturgeschichte (1796). Though Girtanner was not
a first-rate scientist, it is interesting in the present context to note that he apparently sees no problem in combining the genealogical approach of Kant’s first intervention with the general views on teleology of the Kritik der Urteilskraft.
Session: XI.7 Room: Peter Chalk 1.6 Time: Saturday 16:00-17:30
Hernán Pringe
Univeristy of Pittsburgh, Pittsburgh, PA, United States
Teleology and Complementarity: Kant, Bohr, Biology and Atomic Physics
The Kantian influences on Bohr’s thought and the relationship between the perspective of complementarity in physics and in biology seem at first sight completely unrelated issues. However, the goal of this talk is to show their intimate connection. We shall see that Bohr’s views on biology shed light on Kantian elements of his thought, which enables a better understanding of his complementary interpretation of quantum theory. For this purpose, we shall begin by discussing Bohr’s views on the analogies concerning the epistemological situation in biology and in physics. Later, we shall compare the Bohrian and the Kantian approaches to teleology in order to show their close connection. On this basis, we shall finally turn to the issue of complementarity in quantum theory in order to assess what we can learn about the epistemological problems in the quantum realm from a consideration of Kant’s views on biology.
Session: XII.7 Room: Peter Chalk 1.6 Time: Sunday 9:00-10:30
Gregory Radick
Marcel Quarfood
University of Leeds, Leeds, United Kingdom
Södertörn University College, Huddinge, Sweden
Vervetese and its Contexts
In 1980 the ethologists Dorothy Cheney, Robert Seyfarth and Peter Marler, then based at the Rockefeller University, published the results of soon-famous fieldwork on the alarm calls of vervet monkeys. Using experimental playback of recorded alarm calls, the group claimed to have established that those calls were rudimentarily semantic, in that they conveyed information not just about the emotional state of the caller, but about the nature of the environmental threat
Kant’s shifting attitude towards Naturgeschichte and Girtanner’s synthesis
At different stages of his career, Kant made two rather different interventions in the life sciences. His papers on the human races in the 1770s and 1780s attempted to defend the idea of a genealogical approach to systematics (Naturgeschichte), criticizing the standard way of basing classifications on likeness of form (Naturbeschreibung). The program of Naturgeschichte
Session: VII.7 Room: Peter Chalk 1.6 Time: Friday 16:30-18:00
– specifically, whether a leopard or an eagle or a python was present.
As this work was and remains a shining example of the promise of the then-new cognitive ethology, close attention to its methods and their rationales is overdue. This paper will explore some historical and philosophical issues raised by one of the criteria that Seyfarth, Cheney and Marler applied in testing for semanticity in the vervet alarm calls – what we might call the “cross-contextual condition.” The idea was that an animal signal counted as (at least rudimentarily) semantic if it could be shown to elicit the same response across different contexts. Here is the most sustained comment on semanticity-as-context-independence in the 1980 papers:
“In our experiments, context was not a systematic determinant of the responses of vervets to alarm calls. Different alarms evoked different responses in the same context, and responses to some alarms remained constant despite contextual variation. For example, monkeys looked up when they heard eagle alarm calls, regardless of whether they were on the ground or in the trees. Given the variable role of context in determining responses to alarm calls, the most parsimonious explanation would appear to be that, for all those within earshot, each alarm represented a certain class of danger... Individual monkeys then responded according to the nature and degree of their vulnerability to that danger at that time.”
various mapping and sequencing projects on the optimal organisation of these programs, they all put forward various ways to measure the efficiency of the work conducted through statistical analysis, for example by planning their development through the definition of quantified milestones. This led in turn to different forms of rationalisation of production, equating efficiency with productivity rates. Automation and the organisation of work division were the primary means of bridging the gap between the capacities of existing techniques and the goals to be achieved, sometimes redefining the problems to be solved or displacing the type of artefacts to be produced as the technical systems evolved. The various controversies that arose between groups with divergent interests put an even greater emphasis on the question of productivity, which soon became one of the prominent regulating norms of the field, resulting in an ever growing pressure to automate and to rationalise work division accordingly, so as to decrease the costs of production. In this respect, the analysis of the organisation of sequencing centres, highly reminiscent of industrial practices, and of the growing autonomy of management skills in those centres is telling. Finally, the link between these practices and the appearance of what came to be known as “functional genomics” will be discussed. Understanding this variety of practices within such an interpretative framework questions the usual claims defining the history of genomics as a revolutionary process, and challenges some of the usual dichotomies hastily put forward to describe this field of research (such as public vs private, network organisation vs factory-like production centres, etc.).
The question of whether vervet calls are semantic was here addressed by asking whether context systematically determined response. But is that the only reasonable gloss one might give to the cross-contextual condition? If not, what are the other options? And what considerations might have led the Rockefeller group to prefer one construal over others? This paper will try to answer these questions about vervetese and its contexts by recalling – with the help of newly discovered correspondence and other documents – forgotten aspects of the scientists’ experimental programme and its contexts.
Time: Friday 9:30-11:00
Session: I.11 Room: Peter Chalk 2.1 Time: Thursday 11:30-13:00
Session: IV.1 Room: Newman B
The Rats of NIMH and the Urban Crisis
The paper focuses on a series of experiments in overcrowding among rats and mice carried out by John B. Calhoun, an animal ecologist employed at the National Institute of Mental Health (NIMH) from 1956 to 1986. Building upon studies of density and behaviour in ecology, these studies proved to be very influential in the social sciences. Calhoun identified various stress-related “social pathologies” that resulted from increased population density, such as violence, cannibalism, autism and sexual deviance. The paper seeks to explain why these experiments received so much attention in the fields of human ecology and environmental psychology, and how their results were interpreted and adapted to the study of the human condition. While psychologists, sociologists and urban planners sought to build upon Calhoun’s research, they were also careful to reinforce the boundary between the study of animal and human populations.
Vincent Ramillon
MPIWG, Berlin, Germany
The material economy of genomic research: automation, work division, and productivity
I will argue that genomic research can be best characterized by a transformation of the way scientific work was envisioned as an instrument of production. Although no consensus was ever achieved between the various mapping and sequencing projects on the optimal organisation of these programs, they all put forward various ways to measure the efficiency of the conducted work through statistical analysis, for example by planning their development through quantified milestones. This led in return to different forms of rationalisation of production, equating its efficiency with productivity rates. Automation and the organisation of work division were the primary means to bridge the gap between the capacities of existing techniques and the goals to be achieved, sometimes redefining the problems to be solved or displacing the type of artefacts to be produced as the technical systems evolved. The various controversies that arose between groups of divergent interests put an even greater emphasis on the question of productivity, which soon became one of the prominent regulating norms of the field, resulting in an ever growing pressure to automate and rationalise the work division accordingly, so as to decrease the costs of production. In this respect, the analysis of the organisation of sequencing centres, highly reminiscent of industrial practices, and the growing autonomy of management skills in those centres are telling. Finally, the link between these practices and the appearance of what came to be known as “functional genomics” will be discussed. Understanding the variety of practices within this interpretative framework questions the usual claims defining genomics history as a revolutionary process, and challenges some of the usual dichotomies hastily put forward to describe this field of research (such as public vs private, network organisation vs factory-like production centers, etc.).
Edmund Ramsden
London School of Economics, London, United Kingdom
Session: V.6 Room: Peter Chalk 1.5 Time: Friday 11:30-13:00
Diego Rasskin-Gutman
Institute Cavanilles for Biodiversity and Evolutionary Biology, University of Valencia, Valencia, Spain
Evo-Devo Today
In the postgenomics era, the fundamental question concerning how complex morphological structures arise during development and change during evolution is starting to be answered by means of comparative data taken from gene sequencing, protein-protein interactions, and gene expression patterns throughout a great variety of model animals.
Departing from more integrative origins, the new evo-devo takes gene misexpression experiments (“functional” genomics) as its comparative data for analyzing the genotype-phenotype mapping. These experiments are nothing but ways to manipulate genome space with effects on morphospace. However, these effects lack a mechanistic explanation; they rather show only the concurrence of gene expression with the formation of a morphological structure.
I will elaborate on the impact that the original research lines proposed by Pere Alberch throughout his scientific production have on today’s evo-devo. I will compare the mainstream agenda of today’s developmental genetics, which is starting to distort the original meaning and goals of evo-devo, with the kind of research proposed by Pere Alberch and others who, like him, sought to focus the evo-devo questions on the emergence and transformation of form, using evidence beyond genetics (from the fossil record and embryology) to reach a rather non-gene-centred view of the genotype-phenotype mapping.
Session: VII.1 Room: Newman B Time: Friday 16:30-18:00
Ben Rathgeber
Philipps-University, Marburg, Germany
Mirror Neurons and Action-Understanding
Mirror neurons are a particular class of visuomotor neurons, which were discovered in area F5 of the monkey premotor cortex in the early nineties. They discharge both when the monkey does a particular action and when it observes another individual (monkey or human) doing a similar or same action (e.g. grabbing a peanut). That is the reason why they appear to be the connecting link between the neuronal coding of self and non-self perception. Moreover, given the homology of the mirror neuron systems to those of human beings, it has been speculated that they are the basis of the neuronal conception of empathy, imitation and intention-understanding.
The aim of the discussion is to give a short introduction to the mirror neuron system in both monkey and human, to the biological and philosophical consequences, and to the methodical and methodological problems connected with this concept.
Jérôme Ravat
Université Paris 4 - La Sorbonne, Paris, France
Session: XI.6 Room: Peter Chalk 1.5 Time: Saturday 16:00-17:30
Can Naturalized Ethics Help Us Find Moral Truths?
Naturalization of ethics, thanks to recent research in evolutionary biology, neurology, and evolutionary psychology, has often been used to vindicate the existence of moral truths. This strategy is a key component of what is sometimes called “new wave moral realism”. According to naturalistic moral realism, solutions to moral disagreements could be found thanks to the reduction of moral properties to natural facts. This reduction might help us find out natural properties corresponding to the moral predicate “good” and could also resolve several issues and dilemmas in bioethics, such as the moral status of embryo, the legitimacy of abortion, or the validity of animal rights.
In this paper, I argue that “new wave moral realism” is highly questionable, because of the difficulties raised by its naturalistic conception of moral truths. To this end, I will refer to the “social intuitionist model” of Jonathan Haidt. According to this model, moral judgement is caused by quick, automatic intuitions, whose content is liable to broad variations due to social and cultural contexts. My claim, against naturalistic moral realism, is that moral judgements do not describe natural ethical facts. Rather, they express moral intuitions primarily based on emotional mechanisms. Thus, moral judgements in bioethics are the consequence of projections made through ontological intuitions, notably related to consequentialist or deontological commitments. I contend that most interpersonal conflicts between those intuitions cannot be solved, and make it difficult, if not impossible, to reach a universal consensus based on natural moral truths.
Lastly, I maintain against moral realism that ethical applications of the naturalistic program must take into account the fact that natural objects do not share an
essence to which moral rules are applied. Rather, they are subject to several changes through natural selection, cultural evolution, and the development of biotechnologies. Those processes contribute deeply to reshaping our moral landscape. I will examine the polemical issue of human reproductive cloning, to show that this coming innovation should not be perceived as a crime against the human species, but rather as a redefinition of humanity brought about by biotechnologies.
Session: IX.1 Room: Newman B Time: Saturday 11:00-12:30
Kenneth Reisman
Stanford University, Stanford, CA, United States
Session: XIII.11 Room: Peter Chalk 2.1 Time: Sunday 11:00-12:30
The Role Of The Environment In Human Cultural Inheritance
Much of the literature on cultural evolution assumes that the basic mechanism of cultural inheritance is social learning. In recent work, I have argued that learning from the environment can also contribute to cultural inheritance, especially when the environment is constructed and iteratively modified through behavior. This talk further examines the role of the environment in human cultural inheritance. I offer examples of how individual learning and niche construction interact, and assess the significance of such a process for human cultural evolution.
Sylvène Renoud
Université de Nantes, Nantes, France
Session: VIII.5 Room: Peter Chalk 1.3 Time: Saturday 9:00-10:30
The Relationship between Text and Images in Microscopy of Insects in the 17th Century: The Example of Swammerdam
In the 17th century, a new form of natural history appeared, founded on the idea that knowledge must be based on sensory experience and on observation, supported by the use of instruments. The appearance of the microscope was an important element of this new approach to the study of the living world.
What place did the images hold in the work of those naturalists using microscopy? How do these microscopy images interact with the text? How does the use of images intervene in the research method and the understanding of the phenomena observed? These questions will be explored with a survey of the works of the main microscopy naturalists of the 17th century, and more particularly of Swammerdam’s General History of Insects (1685).
Although the compound microscope was first discovered in about 1610, it was not actually adopted by naturalists until the second half of the 17th century. In fact this astonishing instrument, despite its promise of theoretical possibilities, was not put into real use before the 1660s, when true micrography appeared with the works of Hooke, Redi, Malpighi, Swammerdam, or Leeuwenhoek. Thanks to the microscope, these naturalists made major observations, in particular in the burgeoning domain of entomology, which is evident in the innovative illustrations of their works. Paradoxically, one can note that “the invention” of the microscope in about 1610 concerns only the compound microscope, whereas several naturalists, including Swammerdam, preferred for their works the simple microscope, similar to the magnifying glass that had existed since the 13th century. It was therefore not only a tool, but more specifically a usage that was discovered in the mid-17th century.
In his General History of Insects, Swammerdam argued that the idea of insect metamorphosis was false. He proposed a study of insects and their development based purely on observation and experimentation. He then supported his argument with extremely precise copper engravings, illustrating microscope observations of insects at different stages of development, as well as dissections of structures such as the nervous system or the tracheal network.
How did Swammerdam use these images in the elaboration of his theory? How did he succeed in integrating them in his text? How did the illustrations reflect the importance the author gave to microscopy? How did Swammerdam use these images to legitimise the invisible? How did he give them testimony value? Finally, one can question the author’s desire for objectivity regarding his images, as well as the idea of reconstruction in the visual representation.
Christian Reiß
MPIWG, Berlin, Germany
Julius Schaxel and the Emergence of Organicism in Germany, 1910-1933
Stressing both experimental and theoretical approaches to his science, the German biologist Julius Schaxel (1887-1943) participated in early 20th century discussions concerning the fundamental nature of biological processes. For Schaxel, the concepts of mechanism and vitalism did not do justice to the complexity of the issue. Searching for alternatives, Schaxel emphasized the need for a theoretical biology and, in the 1920s, acted as a central figure in the organization of this newly developing discipline in Germany. In a series of publications in the 1910s, Schaxel critically reviewed both the concepts of vitalism and mechanism. Part of these papers was a controversy with Hans Driesch, the central figure of early 20th century vitalism. Interrupted by World War I, Schaxel resumed his efforts in 1919 by editing the scientific series Abhandlungen zur theoretischen Biologie. In this series, leading figures of early 20th-century biology published or intended to publish theoretical monographs concerning a wide range of issues. Among them were Ludwig Bertalanffy, Hans Spemann, and Paul Weiss, main representatives of an organicist biology. In my talk, I present Schaxel’s criticism of the concepts of vitalism and mechanism and contrast it with some of the organicist ideas published in the Abhandlungen. I show how Schaxel developed from a participant to a central figure in early 20th-century biology who tried to organize its development rather than make genuine contributions.
Session: IX.7 Room: Peter Chalk 1.6 Time: Saturday 11:00-12:30
Thomas Reydon
Leibniz University of Hannover, Center for Philosophy and Ethics of Science, Hannover, Germany
How Scientists Use Kinds: Genes and Modules as a Case Study
Philosophers have long treated the topic of natural kinds as essentially a question of ontology. Starting from the assumption that the things in the world by their natures come in different kinds, the scientists’ traditional task was to tell us which natural kinds there are in the world and to uncover their real essences. Building on the scientists’ results, the philosophers’ challenge was to come up with an overarching account of the precise nature of the natural kinds discovered by science and of the ways in which these feature in everyday language, scientific reasoning and investigative practices. So far, however, this way of approaching the issue has not resulted in any generally accepted account of what exactly natural kinds are, which (if any) natural kinds there are in the world and whether natural kinds play any important role at all in science.
An alternative approach to the topic of natural kinds is suggested by the fact that the notion of natural kinds has long (at least since Whewell, Mill and Venn formally coined the technical term) served philosophers as a tool for studying how scientific inferences are supported. From this perspective, a more promising path to take is to address epistemological issues regarding natural kinds first, with ontological issues coming second in line. On taking this approach, elaborating a philosophical account of natural kinds begins by examining which roles natural kinds play in science, i.e., by looking at how classifications of the subject matter of various scientific disciplines into natural kinds are actually being used in these disciplines’ practices of investigation and knowledge production, in their ways of reasoning and in the explanations that they provide. Once we know what these epistemic roles are, we can go on to investigate what the ontology of natural kinds must be to enable them to perform these roles.
In this paper, I shall take this approach to a case study from the domains of genetics and developmental biology. The gene category and particular gene kinds have long been considered as candidate natural kinds for these domains. This case study examines (1) in which ways the gene category, particular gene kinds, the developmental module category and particular kinds of developmental modules perform typical ‘natural kind roles’ and (2) which ontology of natural kinds would fit these roles. I shall argue that a non-essentialist notion of natural kinds is required to make sense of how biologists use these candidate kinds and thus is a useful tool for realizing philosophy of science’s central goal of understanding how actual science works.
Session: VI.7 Room: Peter Chalk 1.6 Time: Friday 14:30-16:00
The Perspective Metaphor of Metaphor
Many accounts of the use of metaphor in science agree that metaphor is an important and perhaps essential component of successful scientific theory and explanation (e.g. those of Black, Hesse, Harre, Bradie, Ruse). One common explanation of why this is so draws upon an account of metaphor in scientific practice which is itself metaphorical. The use of metaphor, it is said, grants one a novel or interesting “perspective” from which to “see” an object, thereby facilitating experimental and conceptual approaches to further research. What are the implications of adopting this ‘perspective’ of metaphor? Can this metaphor be substituted for a more literal account? I suggest that this perspective of metaphor is a natural one as it bridges between the dual aspects of science as process and product. Examples will be drawn from the field of cell biology past and present.
Andrew Reynolds
Cape Breton University, Sydney, Nova Scotia, Canada
Session: IV.5 Room: Peter Chalk 1.3 Time: Friday 9:30-11:00
Hans-Jörg Rheinberger
Max Planck Institute for the History of Science, Berlin, Germany
On the Dynamics of Laboratory Research: Views on Molecular Genetics
A critical view on selected aspects of the history of molecular biology will help to understand research as an autopoietic process: its main incentives are less encompassing theories than the constellations and contingencies created inadvertently along the way. If there is a driving force in this process, it is the introduction of new technology, which creates options for shaping new phenomena and the concepts involved in their assessment. The paper intends to give an impression of the overall history of molecular biology and to develop its epistemic understanding on a case study basis. The case study looks at the introduction of the notion of information in the molecular genetic work of Francois Jacob at the Pasteur Institute in Paris. At the same time the paper aims at understanding the changing historiographical trends that accompanied the rise and spread of the new biology, including the advent of gene technology.
Time: Saturday 9:00-10:30
Session: X.1 Room: Newman B Time: Saturday 14:00-15:30
Robert Richardson
University of Cincinnati, Cincinnati, OH, United States
Session: VIII.10 Room: Peter Chalk 2.3
Integration and Disintegration in Evolutionary Biology
One persistent theme in Richard Burian’s work is the tension between the particularity of biological work, focusing on the details of specific organisms in their natural environments, and the importance of theoretical integration, drawing on the interactions of various and surprisingly disparate disciplines. The portrait of biological work which results is characteristically complex: while the interactions are crucial, with varying contributions depending on the available methodological tools and the organisms that are suitably understood, the resulting integration falls far short of the reductionistic ideals that are often characteristic of both philosophical and biological work.
After wrestling with Burian’s portrayal of the tension between integration and disintegration within the Biological Sciences, I turn to specific cases within evolutionary biology. I will focus on some recent biological work on the evolution of troglomorphy, emphasizing the contributions that are offered by development, genetics and ecology to the understanding of the evolutionary phenomena, and the surprisingly complex phenomena that are involved. The case exhibits the very kind of complexity that Burian thinks is typical within biology.
Sarah Richardson
Stanford University, Stanford, CA, United States
Are Men and Women as Different as Humans and Chimpanzees?: Quantifying Sex Differences in the Human Genome
On completing the sequence of the X and Y chromosomes, researchers announced that male and female genomes differ by “two percent,” “greater than the hereditary gap between humankind and its closest relative – the chimpanzee.” On this basis, researchers have claimed that “the genetic difference between males and females absolutely dwarfs all other differences in the human genome” (Bainbridge 2003) and that the human genome, once presented as the common inheritance of humankind, is better conceived as “not one human genome, but two – male and female” (Connor 2005).
This paper analyzes the use of methods from comparative evolutionary genomics and human population genetics to estimate the amount of difference or “genetic distance” between human males and females. The paper will present a close methodological critique of the application of the concept of genetic distance, developed to reconstruct the history of evolution and speciation, to comparisons between sexes within a species. First, I discuss the model-theoretic assumptions of comparative genomic estimates of genetic distance between humans and chimpanzees and describe the outstanding conceptual and interpretive debates in this field. Second, I demonstrate how the figure of “two percent” genetic difference between human males and females is an inflated one under any set of assumptions. This is in part because the figure relies on a simplified picture of human biology that overlooks basic questions about how to describe gene structure and function, including coding versus noncoding genes, expression profiles, and proteomics. Third, I argue that any quantitative estimate of genetic difference between human males and human females relies on a model of genetic difference that makes a comparison with genetic distance between humans and chimpanzees wholly inappropriate and misleading.
Session: I.4 Room: Peter Chalk 1.1 Time: Thursday 11:30-13:00
Session: II.10 Room: Peter Chalk 2.3 Time: Thursday 14:30-16:00
Marsha Richmond
Jason Robert1,2
Wayne State University, Detroit, Michigan, United States
1) Arizona State University, Tempe, AZ, United States, 2) University of Arizona College of Medicine - Phoenix, Phoenix, AZ, United States
Evolutionary developmental medicine
What are the prospects – conceptual, curricular, collaborative, and clinical – for evolutionary developmental medicine? I engage recent and emerging experimental and theoretical research in evo-devo with particular attention to evolutionary aspects of the development of disease. I briefly discuss the evolution of developmental pathways and processes of disease, and focus on the mutual interrelations between the development and evolution of organism-disease systems. Beyond sketching a framework for answers to persistent questions about the origins and ontogeny of simple and, more interestingly, complex human diseases, I also briefly explore the organizational, curricular, and conceptual obstacles to evo-devo medicine.
Conflict, Controversy, and Gender in Early Genetics: Selected Case Studies
Disputes of various kinds are part and parcel of scientific practice and of the activities of the scientific community. Controversies often involve differences between individuals or groups over such issues as method, the introduction of new techniques or technologies, the interpretation of phenomena, priority, and conflicting paradigms. Conflicts also arise concerning scientific research in areas of social concern, in which values, economics, and politics come into play.
In the literature considering questions involving controversy in science, little attention has been paid to the role that gender may play in affecting debates and disagreements. Yet gender (like race and religious confession) has almost certainly been a complicating factor in negotiating disputes in different arenas of scientific activity.
Session: III.10 Room: Peter Chalk 2.3 Time: Thursday 16:30-18:00
Genetics is a particularly good discipline in which to investigate this question. It emerged as a field of inquiry after the introduction in 1900 of Gregor Mendel’s methodology for pursuing and interpreting hybridization studies. By 1910 it was widely accepted as a legitimate and promising means of studying problems of heredity. The field also attracted a significant number of women in the early years, largely owing to the increased pool of biology graduates resulting from the opening of academic study to women in the last decades of the nineteenth century. While historians have begun to explore the nature of the contributions women made to early genetics, to date we know very little about how women’s participation influenced various aspects of disciplinary development.
This paper will examine selected episodes involving controversies in the careers of two early women geneticists, Anne May Lutz (1871-1938), cytogeneticist at Cold Spring Harbor Laboratory for Experimental Evolution from 1904-1910, and Edith Rebecca Saunders (1865-1945), a leading member of William Bateson’s Mendelian research group at Cambridge University from 1895. The aim will be to determine the ways in which gender may have influenced the nature of the controversy, as part of a more general study of the history of women in the early disciplinary development of genetics.
Nicolas Robin
Friedrich-Schiller-Universität, Jena, Germany
Discussing the “translation” of J. W. von Goethe’s knowledge of nature into scientific literature for women
The aim of this paper is to discuss a “dilettante” format of the reception and diffusion of J. W. von Goethe’s theory of the metamorphosis of plants during the first part of the nineteenth century in Germany. The publication of Botanik für Damen, Künstler und Freunde der Pflanzenwelt überhaupt, enthalten in einer Darstellung des Pflanzenreichs in seiner Metamorphose in 1828 by Heinrich Gottlieb Reichenbach forms the basis of our argument regarding the impact of this gender-related “translation” and standardisation of knowledge within a broad community of amateurs of botany. Our paper deals first of all with the images of Goethe’s experience of nature as well as with the relevance and implementation in the scientific literature of his theoretical account of metamorphosis in the plant kingdom. We postulate that Reichenbach’s mention of women, artists and friends of the plant kingdom in the title tells us more about a political and economic choice than about the author’s aspiration to reach a non-professional public. Indeed, a textbook with such a target group was not expected to draw the attention of professional botanists in the first place and, consequently, protected the author, in a sense, from their criticism. In addressing his readers in this way Reichenbach acquired the freedom to prove his conception of the natural affinities of plants and to elaborate the theory of Goethe, who was recognised by the botanical community more as a poet than as a botanist. In fact, Reichenbach’s publication cannot be compared with other classic books for dilettantes such as Jean-Jacques Rousseau’s Lettres élémentaires sur la botanique or August Johann G. C. Batsch’s Botanik für Frauenzimmer. In his book Reichenbach presented the foundations of plant morphology and systematics scientifically but used Goethe’s conceptions and “system” as a background. This approach was quite similar to the attempts of natural scientists like Heinrich F. Link (Philosophia botanica, 1798), Friedrich S. Voigt (System der Botanik, 1808) and others. In an overview of the works on the ‘natural method’ (Die natürlichen Pflanzensysteme geschichtlich entwickelt, 1840), Herrmann L. Zunck did not include the approach of Goethe. Nevertheless, among 26 theoretical textbooks like those of Antoine-Laurent de Jussieu and Augustin-Pyramus de Candolle we find Reichenbach’s work for dilettantes and, therefore, also Goethe’s conceptions. Apparently the botanist Zunck gave more attention to a book for amateurs than to the original text of Goethe. Thus, our paper aims to examine the details of such processes of standardisation of knowledge and their impact in the field of botany. On the borderline between knowledge and science, the example of Reichenbach’s botany for women allows us to discuss the modes of reception and standardisation of scientific concepts through different formats of discourse, and also offers an outstanding field for epistemic and cultural-historical debates on the development of botanical knowledge.
Material Transfer Agreements And Policy Implications: Strategies For Research Materials In Biotechnology
Because reach-through provisions on patents proved too slow and uncertain for privatizing research, the biotechnology sector created material transfer agreements (MTAs). Up to now, however, MTAs do not rest upon codified legal statutes defining specific rights and obligations. Instead, reflecting freedom of contract, parties have wide discretion in setting the terms of their agreements and tailoring them to their specific needs. MTAs have become the most common means to
impose prepublication review, disclosure restrictions, liability indemnification, limitations upon actual use, and reach-through provisions.
The challenges that MTAs pose for the conduct of science are grounded in the tragedy of the anti-commons. On this hypothesis, restrictions and reach-through provisions of MTAs can be so onerous, and yet so dispersed throughout a population of claimants, that negotiations over research materials become prohibitive, with the science held hostage to a phalanx of property managers. Access to research materials is crucial for developing biotechnology. In a quid pro quo approach, two models may facilitate access to patented research materials: patent pools and clearing houses. If research institutions lack room for manoeuvre in obtaining patented research materials, three alternatives are available: a research or experimental use exemption, conventional one-to-one licensing, and compulsory licenses. A further problem arises when the material transfer occurs before the provider files a patent application for it; MTAs with confidentiality provisions or trade secret contracts may be the solution. If the owner of a research material refuses access to it for research and development (R&D), we think that antitrust legislation can be applied.
The objective of this article is to discuss the policy implications of MTAs for the European Research Area (ERA). We begin with European governance on MTAs. Then we analyse MTAs and knowledge diplomacy. After that, we study cases related to MTAs in the United States. Before concluding, we review empirical findings and discuss strategies to access research materials and to cope with restrictive provisions in MTAs. Finally, we argue that norms emanating from a governance body to regulate the exchange of research materials are needed in the ERA.
Alfred R. Wallace and his vision of anthropology and evolution
During the Victorian period, the vision of humankind and of the biological and cultural processes related to it was very homogeneous within the scientific community, especially in British scientific society. One of the most famous examples of this vision was Darwin’s: supporting his idea with the theory of natural selection, he considered that humankind was, in an integral way, the result of natural processes such as natural selection. At the same time, one of the most notable exceptions to this homogeneous vision was Alfred R. Wallace who, although best known as co-discoverer of the theory of natural selection, had many interests that made him an exceptional and particular figure, in fact an example of interdisciplinarity. Many of these interests put Wallace in a compromising position, for example his belief in spiritualism, but beyond his particular views he made many contributions to biology in particular and to science in general. Our interest here is to show some of his contributions to anthropology, especially social anthropology, and the repercussions of the theory of natural selection in relation to human races. The presentation will be based mainly on works such as Contributions to the Theory of Natural Selection, The Origin of Human Races and the Antiquity of Man deduced from the theory of “Natural Selection”, Human Selection, Man’s Place in the Universe, On the Varieties of Man in the Malay Archipelago, and Darwinism, among others, all of them samples of the deep vision of humanity that Wallace developed, going beyond the rest of the scientific community and immersing himself in the life of the communities he visited, and giving the anthropology of his time a new view of human groups, as in the case of the Malayans and other groups of South-east Asia. The relevance of this presentation resides in the originality of Wallace’s anthropological proposal: it is well known that the descriptions he made of the people he encountered on his voyages to South America and the Malay Archipelago were extraordinary in his time, offering a point of view that went beyond academic circles, since he included in his descriptions not only biological elements but metaphysical, social and political elements too. In general, we can say that this vision serves as an example of interdisciplinary (or even transdisciplinary) work, since Wallace’s anthropological vision brings together many of his different interests – sociology, evolution, anthropology, biogeography, etc. – resulting in a unique idea and placing Wallace as an extraordinary example of a scientist capable of uniting different fields of knowledge.
Session: VI.6 Room: Peter Chalk 1.5 Time: Friday 14:30-16:00
Victor Rodriguez,
Koenraad Debackere
Katholieke Universiteit Leuven, Leuven, Belgium
Session: VII.9 Room: Peter Chalk 2.5 Time: Friday 16:30-18:00
Juan Manuel Rodríguez Caso, Rosaura Ruiz Gutiérrez
National Autonomous University of Mexico (UNAM), Mexico City, Mexico
Session: IX.6 Room: Peter Chalk 1.5 Time: Saturday 11:00-12:30
Building Bioinformatic Knowledge: Interlinking Social Networks and Producing a Valid Microarray Experiment
Bioinformatics brings together expertise in biology, computer science, and statistical methods, constituting new experts who are weaving tacit skills and knowledge networks out of multi-disciplinary circumstances. The evolution of biomedicine hinges upon bioinformatic processes, which transform incomprehensible data matrices into workable scientific information. This undertaking is necessarily contingent upon advances in information technology which are continuously expanding in order to serve ongoing developments in the life sciences. The post-genomic era promises foremost to allow for understandings of human dispositions which are in turn expected to drive medical diagnoses and treatments towards increased personalization and predictability. The DNA microarray is a significant tool of this revolution, capable of reviewing thousands of genes at a time and scanning for potentially defective interactions. Many of the massive datasets generated by these devices are currently looming in repositories with latent, spurious, and often mistakenly classified information embedded within them. Although the first published microarray experiment appeared in October 1995, the technique was then still six years short of comprehensive publication standards. Since 2001, efforts to standardize this innovation in gene-expression detection have consisted of various entities working together to retain the promises of personalized medicine. The accumulation, storage, and mining of data remain multi-skilled endeavours bringing together different types of scientists who embody a diversity of scientific traditions. Bioinformatics ‘builds’ new knowledge bases out of these traditions through pragmatic and pioneering efforts, but also through disciplinary biases. For microarray experiments, this includes targeted information dissemination through forums such as meetings, workshops, conferences and listservs, as well as broader exchanges with government organizations, industry players, and academic institutions. A network of actively constructed objectivity is described amid the communicative and inter-connective nature by which standards are agreed upon and employed. This paper will explore how bioinformatic skills and knowledge are accumulated and exchanged by highlighting the genealogy of standardization processes undertaken in order to bring microarray experimentation into the realm of objective science.
Volker Roelcke
University of Giessen, Giessen, Germany
Population Genetics and Psychiatry in the 1930s: British Scientists and their Views of the Munich School of Psychiatric Genetics
During the 1930s, the group around Ernst Rüdin of the Genealogisch-Demographische Abteilung (GDA) at the Deutsche Forschungsanstalt/Kaiser-Wilhelm Institut für Psychiatrie in Munich was perceived as internationally leading the field of psychiatric genetics. Amongst other institutions, the Rockefeller Foundation funded promising young researchers for post-doctoral studies at the Munich Institute. One of them was Eliot Slater, psychiatrist as well as student of Ronald A. Fisher, and part of a broader group of statisticians who applied their methods to the field of population genetics. Today, Slater is considered the founding father of psychiatric genetics in Britain. The paper will address the motivations, practice, and repercussions of Slater’s stay at the GDA; it will also reconstruct the broader views of British population statisticians on the Munich group of psychiatric geneticists.
Session: IV.7 Room: Peter Chalk 1.6 Time: Friday 9:30-11:00
Susan Rogers
McGill University, Montreal, Quebec, Canada
Session: X.7 Room: Peter Chalk 1.6 Time: Saturday 14:00-15:30
Nils Roll-Hansen
Multilevel Selection, Evolutionary Transitions, and Adaptive Complexity
The theory of natural selection (NS) has been formulated as encompassing a set of principles that dictate conditions for evolution by natural selection to occur. The principles apply to all levels of the biological hierarchy and this generates the problem of the units of selection. Parallel to this, NS has been extended to a multilevel theory with the Price equation being at the core of the theoretical frameworks. In addition, one of the central explanatory challenges of NS has been to account for the fact that the whole biological hierarchy itself is the product of evolution. This has led to the problem of explaining the major evolutionary transitions that brought the living world to its present state. And that has involved a theoretical transition regarding the very problem of the units of selection: the properties that characterize evolution by natural selection have themselves evolved. Drawing and elaborating on work by James Griesemer and Samir Okasha, this paper aims to connect Griesemer’s developmental approach, which characterizes reproducers as units of evolutionary transitions, with Okasha’s multilevel account of evolutionary transitions in a single explanatory framework. The role of adaptive complexity is elaborated, bringing the notions of evolvability and robustness to bear on how to understand an evolutionary transition.
University of Oslo, Oslo, Norway
Session: III.2 Room: Newman C Time: Thursday 16:30-18:00
Alirio Rosales
University of British Columbia, Vancouver, BC, Canada
Wilhelm Johannsen’s concept of the genotype
The purpose is to reflect on the origins, nature and impact of Johannsen’s concept of the genotype. The paper will focus on Johannsen’s bean selection experiment, published in 1903. It starts from the problem-situation as Johannsen conceived it at the outset, in relation to de Vries, Bateson, Galton, Pearson, etc.; follows the changes in his thinking in the course of the experiment, issuing in his early (the biometricians said premature) publication of 1903; and ends with a discussion of the impact of Johannsen’s ideas in the following years, including his visit to the US in the winter of 1911-1912. In the course of the paper I will discuss earlier historiographic interpretations of Johannsen’s genotype concept by, for instance, Frederick Churchill, Ernst Mayr, William Provine, and Lenny Moss.
Time: Saturday 9:00-10:30
Explanatory Models in Behavioural Endocrinology: Unifying the Mechanisms
The reigning model in theories of sexual differentiation in mammals and especially in human beings is the organization/activation model. According to this model, prenatal hormone exposure permanently organizes both the genitalia and neural circuitry, which are later activated by hormones released during puberty. The model purports to explain a host of phenomena, including sex differences in cognitive abilities, cerebral lateralization, and the etiology of homosexuality and transsexualism. It appears to be an example of what Kitcher calls “explanation by unification.” However, Kitcher’s ideal of explanatory unification relies explicitly upon deductive inference, rather than mechanistic explanations. Because endocrinologists “cash out” the etiology of the relevant phenomena in terms of causal mechanisms, any philosophical account of explanatory models in behavioural endocrinology must accommodate this fact. In this paper, I demonstrate that the organization/activation model explains by unifying the mechanisms.
Session: VIII.10 Room: Peter Chalk 2.3
Sylvia Rolloff
Washington University, St. Louis, MO, United States
Session: X.6 Room: Peter Chalk 1.5 Time: Saturday 14:00-15:30
Joan Roughgarden
Stanford University, Stanford, CA, United States
Optimality: Restoring Life to the Living
I applaud the use in ecology and evolutionary biology of optimality thinking both for single-organism traits using optimality models, and for organism interactions
using game theory. These approaches allow us to acknowledge the distinction between living and nonliving entities. Oceanic circulation may propel obedient water molecules by eddy diffusion and advection in any direction, but even tiny larvae, as smart sailors, choose their depths to hitchhike on specific currents that take them in the direction of food or home. Without optimality thinking, a dust particle in the water does not differ from a dust-sized larva. Mechanistic thinking is death, optimality thinking is life. Optimality anoints the living with agency. The extension of optimality thinking to behavioral interactions using game theory, especially cooperative game theory rather than competitive game theory, promises to account for how complex social behavior, such as the production of conferences like this by the ISH, emerges from the interactive dynamics of living academics.
Time: Friday 14:30-16:00
Session: XIII.4 Room: Peter Chalk 1.1 Time: Sunday 11:00-12:30
Arun Saldanha
University of Minnesota, Minneapolis, United States
Session: VI.7 Room: Peter Chalk 1.6
Jennifer Runke
Thinking Populations Through Deleuze
Gilles Deleuze’s Difference and Repetition has been duly recognized as one of the most important treatises in continental ontology of the twentieth century. It can be seen as a summit in a long lineage of philosophy of biology, which runs from Aristotle and Lucretius to Spinoza, to Nietzsche, Bergson and Georges Simondon. Rigorously avoiding essentialism at every turn, Deleuze’s conceptions of “difference in itself,” “intensity” and “the virtual” are clearly useful for shedding new light on individuation and evolution. However, the precise status of population thinking within Deleuze’s ontology has not received adequate attention in the commentaries. This paper will argue that a Deleuzian philosophy of difference has to be more committed to explaining the ecological emergence of populations, especially those called “racial formations”. In particular, the paper will argue for an ontological distinction between “individuals” and “populations,” a fortiori within the human species. The biophysical processes that coalesce a set of individual bodies into a geographical population are also what make it impossible to talk of populations as simply individuals writ large.
University of Calgary, Calgary, Alberta, Canada
Towards an Adequate Theory of Metaphor in Biology
Many metaphors are considered to be ‘theory- constitutive.’ The ‘adaptive landscape’ and ‘genetic information’ are a few such examples. Boyd (1979) argues that theory-constitutive metaphors can and should be replaced with a literal paraphrase once the phenomenon is better understood. Others, notably Ruse (2000) and Bradie (1999), argue for ‘the ineliminability thesis,’ namely that theory-constitutive metaphors cannot be jettisoned from a theory without a loss of content even once the phenomenon is better understood. If the ineliminability thesis is right, then there should be some content that is uniquely expressed by the metaphor. In other words, metaphors should be found to contribute both expressible (i.e. propositional) content and inexpressible (i.e. non-propositional) content to a theory. What features would a theory of metaphor have if it were to allow for both types of content?
First, the theory would be pragmatic. On a pragmatic theory, the metaphor’s meaning is determined by the intentions of the speaker. This is important because the speaker can intend to communicate more than just propositional content.
Explaining what non-propositional content a speaker can communicate via metaphor presupposes the second feature a theory of metaphor ought to have. In particular, an explanation of metaphor should be thought-based, not word-based. Consider the adaptive landscape. We do not think about the words ‘genetic fitness’ and ‘adaptive landscape’ or even about what is closely associated with those concepts. Instead how we think about genetic fitness is structured by how we think about landscapes. The ‘how’ is what I consider non-propositional.
Non-propositional content turns out to be not content at all, but is the re-structuring of propositional content. Even if we try to eliminate a metaphor from a theory by paraphrasing its propositional content, we will miss the particular perspective we have adopted and most likely continue to hold when thinking about the phenomenon in a mature theory. Thus, a theory of metaphor can be devised to support the ineliminability thesis.
Session: III.10 Room: Peter Chalk 2.3 Time: Thursday 16:30-18:00
Dawn Sanders
Natural History Museum, London, United Kingdom
Private Letters, Public Discourse:
The botanical correspondence of Mary Treat and Charles Darwin
The proposed paper will examine how one woman, Mary Treat, excluded by gender from the scientific milieu, used a common form of social correspondence, the personal letter, to enter into botanical epistemological discourses with Charles Darwin.
Session: VII.6 Room: Peter Chalk 1.5 Time: Friday 16:30-18:00
Glenn Sanford
Sam Houston State University, Huntsville, TX, United States
Session: II.9 Room: Peter Chalk 2.5 Time: Thursday 14:30-16:00
Educating Citizens: Scientific Literacy and Public Policy
The national media often reports on the state of public schools and the quality of our educational system in general. This, coupled with debates over global warming, stem cell research, and the political influence of evangelical Christianity, has vaulted the politics of public school science curricula onto the front pages. Although intelligent design and other “alternative approaches” have failed to gain ground within the mainstream scientific community, they have captured the popular imagination and garnered political support in a variety of areas. Whether one considers the hearings held by the Kansas Board of Education, the Kitzmiller v. Dover Area School District court case, or disclaimers in biology textbooks, questions of demarcation have arisen in a number of areas. What should be included within the rubric “scientific” has taken center stage in these debates. A prominent feature of these debates has been practicing scientists expressing frustration at their opponents’ misrepresentation of science and scientific practices. As opponents of evolution stress “teaching the controversies,” “critical thinking,” and “critical inquiry,” the public is left to question why a scientist would ever oppose these seemingly beneficial practices. Beyond the vagaries of turf wars and the politics of a university curriculum committee, we face the challenge of preparing students for meaningful participation in future local, state, and national policy debates while simultaneously responding to the demands of current policy decisions and public pressures to improve our educational process. With debates over accountability, funding, stem cell research, and “No Child Left Behind” as the backdrop for school board meetings and the subsequent courtroom battles over evolution, creationism, and intelligent design, limiting a discussion of science education to navigating a curriculum process would be disingenuously myopic. It is in this context that defining “scientific literacy” in a meaningful way becomes the initial step toward establishing a working policy for science education. A problematic and interesting situation arises when we try to unpack the concept of “scientific literacy” within a democratic process wherein scientists are outnumbered by those with other backgrounds. Beyond the vagaries of turf wars and the politics of a college or university curriculum committee, we face the challenge of preparing students for meaningful participation in future local, state, and national policy debates while simultaneously responding to the demands of current policy decisions and public pressures to improve our educational process.
Roger Sansom
Texas A & M University, Texas, United States
Why Gene Regulation Networks Are the Controllers of Ontogeny
We may know of no system that achieves a feat more difficult to engineer than that pulled off by an organism in reproducing a complex self in a natural physical environment. Insight into such a system is gained by understanding the controller of that system. In this paper, I set out to find the controller of the process of ontogeny. This requires both an investigation into the notion of control and into the process and evolution of ontogeny. I identify a new extrinsic concept of design control and suggest that gene regulation networks are the controllers of ontogeny.
Philosophers typically discuss the concept of control within the context of action theory. For example, in her investigation of how to apply reasons to animals, Hurley defines control as “the maintenance of a target value by endogenous adjustments for exogenous disturbances.”
I call Hurley’s notion of control causal control. Much of ontogeny involves negative feedback systems. All components of negative feedback systems can be seen as causally controlling the other components. This leaves us free to see processes in ontogeny as a complex system of components mutually controlling each other, which is an important insight, but it does not allow us to uncover the controller of ontogeny and the insight that this might bring.
Some components that causally control a process have
been designed (or have evolved by natural selection) to carry out that role. Such components design control the process. This notion of design control has the potential to rank the degree to which different systems design control a particular process. One of the components that causally controls a target value is the component that was most precisely designed to control that target value. That system is the controller of that process. The controller of ontogeny is the design controller of the widest range of processes in ontogeny. That system is the system that was most evolved by natural selection. I propose that the gene regulation network is the most evolvable and evolved system in ontogeny, and is therefore the controller of ontogeny.
Session: X.3 Room: Newman D Time: Saturday 14:00-15:30
María Jesús Santesmases
Consejo Superior de Investigaciones Científicas, Madrid, Spain
Reproduction and Cell Cultures: Human Genetics and Prenatal Testing in the Baby-boom Era
During the 1960s and early 1970s clinical settings were the locus for introducing human genetics practices in Madrid. A small group of Spanish clinicians were trained in prenatal diagnosis techniques while human genetic tests also started to develop. While a paediatrician stated that Down Syndrome could be diagnosed by “a single glance”, a recently established genetics service started to learn techniques for counting chromosomes from blood samples for cytogenetic testing. For obstetricians, curiosity and a strong interest in the foetus growing inside the woman’s womb remained, and turned to new techniques for extracting amniotic fluid in order to look for and identify foetal cells. While starting to culture those foetal cells, clinicians began to talk about the prevention of diseases in newborns, exchanging information with paediatricians who applied the same techniques to diagnose congenital disorders in children. These professional interests came together to lay the basis for new cultures of human reproduction in the baby-boom era.
Session: I.4 Room: Peter Chalk 1.1 Time: Thursday 11:30-13:00
John Sarnecki
University of Toledo, Toledo, OH, United States
Developmental Objections to Evolutionary Modularity
Evolutionary psychologists argue that selective pressures in our ancestral environment yield a highly specialized and genetically fixed set of modular cognitive capacities. However, recent papers in developmental psychology and neuroscience claim that evolutionary accounts of modularity are incompatible with the flexibility and plasticity of the developing brain. Instead, they propose cortical and neuronal brain structures are constructed through interactions with our developmental environment. David Buller and Valerie Gray Hardcastle contend that evolutionary accounts of cognitive development are unacceptably rigid in light of evidence of cortical plasticity. Using examples involving interruptions in normal developmental processes and traumatic cranial injuries, Buller and Hardcastle argue that the developing structure of the brain is both too random and too sensitive to external stimuli to be the product of a fixed genetic mechanism. They also claim that our genetic material radically underdetermines the physical structure and configuration of the human brain. There simply are not enough genes to program the intricate neuronal structures that are essential to cognition. I argue that neither of these arguments is persuasive. Small numbers of genes can function to determine diverse phenotypical outcomes through evolutionarily selected developmental systems. Cascading gene sequences enable the recursive application of particular genetic subroutines to contribute to many distinct developmental processes. Similarly, theories of modularity are not incompatible with the view that innate cognitive systems systematically exploit environmental regularities to guide the developing structure of the brain. Hence, stable environmental structures can be employed both to cue and calibrate distinct aspects (neuronal migration and muscle innervation, for example) of the developing brain. As a consequence, innate cognitive development is situated in an environment in which particular interactions are expected to contribute to developmental outcomes. This suggests that the anti-adaptationist implications of these developmental arguments should be rejected.
Helga Satzinger
Session: III.9 Room: Peter Chalk 2.5 Time: Thursday 16:30-18:00
Wellcome Trust Centre for the History of Medicine at UCL, London, United Kingdom
A Weimar Mongrel: The Debates in Biology and Art on Gender, Race, and Genes
The paper explores scientific debates on intersexuality and gender dichotomy in Weimar genetics, medicine and art. These debates, originating in genetics and the research into sex-determination, were interwoven with the debates on race purity. It can be shown how different concepts of genes were linked to different concepts of sex/gender difference, and how the different gender concepts were linked to the race politics and anti-Semitism of that time. The paper compares the concepts of the geneticist Richard Goldschmidt with those of the race hygienist Fritz Lenz. Lenz used Goldschmidt’s experiments of crossbreeding Lymantria dispar and producing “intersexual” animals to argue that miscegenation in humans would inevitably lead to the degeneration of the group of people he saw as the most developed: “the Nordic race”. At the level of genetics it can be shown that two different gene theories were used in two different scientific and political concepts. One concept used a strict binary gender order, the utopia of a pure superior race, and a theory of constant and stable genes, as proposed by Thomas H. Morgan. The other concept saw female and male aspects present in one organism, thus allowing “intersexes” to occur; miscegenation was not a threat to the nation, and the genes did not have to be stable at all times and in every context.
Against the background of this biomedical debate in the 1920s and early 1930s, three contemporary photomontages of the Dadaist artist Hannah Hoech will be interpreted as a radical comment on, and a utopian integration of, gender and racial differences that overcomes hierarchy-creating dichotomies.
Time: Friday 14:30-16:00
Session: X.10 Room: Peter Chalk 2.3 Time: Saturday 14:00-15:30
Henning Schmidgen
Max Planck Institute for the History of Science, Berlin, Germany
Session: VI.2
Room: Newman C
Living concepts? Georges Canguilhem and the History of Biological Concepts
Twelve years after his famous “Essay on some problems concerning the normal and the pathological,” Georges Canguilhem published a book length study on the history of a biological concept: the reflex. Inside of France, his “Formation of the reflex concept in the 17th and 18th centuries” (1955) contributed significantly to define and exemplify the “French style” of doing history of science. Outside of France, the book passed largely unnoticed. This paper presents and discusses Canguilhem’s history of the reflex concept with respect to its historiographical and epistemological implications. Special emphasis is placed on Canguilhems understanding of scientific concepts. Canguilhem defines concepts as threefold entities consisting of terms, definitions, and phenomena. Similar to Bergson, he stresses the connection between conceptual activities and other functions of organic individuals in their respective environments. As a consequence, biological concepts are tied to the biology of concepts. I argue that this seemingly circular structure is a major feature in Canguilhem’s historical approach to the history of the biological sciences.
Kenneth Schaffner
University of Pittsburgh, Pittsburgh, PA, United States
Theories, Models, and Equations in Biology: The Heuristic Search for Emergent Simplifications in Neurobiology
This paper begins with a review of some claims made by biologists such as Waddington and von Bertalanffy, and others, that biology should seek general theories similar to those found in physics, for example in Newton’s theory of gravitation and its elaboration in the Principia, some treatments of Maxwell’s electromagnetic theory, or thermodynamics, quantum mechanics, and relativity theories. I disagree with that view, and describe an alternative framework for biological theories as collections of prototypical interlevel models that can be extrapolated by analogy to different organisms. To exemplify this position, I look at the development of the Hodgkin-Huxley giant squid model for action potentials in detail. The Hodgkin- Huxley strategy uses equations, but in specialized ways involving heuristic approximations, to build their model, which is here viewed as an “emergent simplification.” Very current elaborations of the Hodgkin-Huxley model, including Hille’s, suggest that “The NA, Ca, and K families of voltage gated channels form a homologous
Session: XI.5 Room: Peter Chalk 1.3 Time: Saturday 16:00-17:30
paleontology – was forced to take a back seat in evolutionary discussions to the proclaimed superiority of population (genetics) thinking. Consequently, while only paleontology can provide a temporal overview of life, the taint of “gaps in the fossil record” overshadowed any contribution to evolutionary theory this discipline might make. Not surprisingly, therefore, while Dobzhansky (1937, 1941) and Mayr (1942) could discuss the origin of species, Simpson (1944) could only address tempo and mode. As Darwin had done, Simpson dismissed the “gaps” in the fossil record as taphonomic inconveniences and used them in his “quantum theory” as evidence of a smooth but rapid transformation of populations so small that their chances of preservation were slim to begin with. In his overview of invertebrate and vertebrate paleontology, however, Schindewolf (1950) interpreted these “gaps” as evidence of the non-gradual, non- smoothly transformational nature in which evolutionary novelty can emerge. Further, Haldane’s (1932) discussion notwithstanding, Schindewolf’s consideration of paleontology, comparative morphology, and development in the formulation of his stepwise theory was much more synthetic than the MS. Yet because Schindewolf was unabashedly anti-Darwinian, his presentation fell largely on deaf ears. Evidence of the weight of the MS and its adherents’ intolerance to real as well as perceived anti- Darwinian ideas is noted in the hostile reception given Eldredge and Gould’s (1972, 1977) model of punctuated equilibria. Although subsequently reconfigured by Gould, the original model was both overtly selectionist and Simpsonian in interpreting the fossil record as affording evidence of a smooth, albeit rapid, transformation from one species to another. Whether it was because Eldredge and Gould aligned their model with Wright’s notion of partial peripheral isolates (although it should have been with Haldane’s evolutionary model based on completely isolated peripheral populations), and the MS had marginalized Wright, or because the model of punctuated equilibria was not a model of gradual change, the slurs heaped upon the model and its authors echoed those earlier directed at actual non-Darwinians. The historical irony is that many of the alternatives to Darwinism, from the late 19th century on, are more relevant now to understanding the origin of novelty and thus potentially of species – as opposed to the persistence or survival of novelty and thus or species – than the dogma that sought to repress them.
Emily Schultz
St. Cloud State University, St. Cloud, MN, United States
Balinese Water Temples Revisited: Approaching Steven Lansing’s Balinese Ethnography from the Perspective of Constructivist Evolutionary Anthropology
Anthropologist Steven Lansing’s ethnographic work on the water temples of Bali has been used by David Sloan Wilson as evidence to support his theory of cultural group selection and by John Odling-Smee, Kevin Laland, and Marcus Feldman as evidence to support their claims about niche construction. I will argue, however, that these interpretations of Lansing’s achievements are thin and inadequate, and, from the perspective of constructivist cultural anthropology, miss entirely what Lansing has managed to achieve. Using material taken primarily from Lansing’s recent book Perfect Order (2006), I will argue that a more robust understanding of Lansing’s ethnographic achievements is attained when his use of complexity theory is situated within the broader commitments of contemporary constructivist cultural anthropology. By incorporating theoretical contributions of developmental systems theory and actor network theory, and drawing on insights by William Wimsatt and James Griesemer concerning the generative entrenchment of modular structures that scaffold cultural evolution, I will offer an alternative interpretation of Lansing’s work from
the perspective of what I call constructivist cultural evolution.
Session: I.2
Room: Newman C
Time: Thursday 11:30-13:00
Jeffrey Schwartz
University of Pittsburgh, Pittsburgh, PA, United States
Was the modern synthesis really a synthesis?
Although extolled by historians of science as uniting a diversity of biological pursuits through a language of population genetics-infused Darwinism, the Modern Synthesis (MS) left a less than sanguine legacy. As is evident in the theories of the Victorian saltationists (especially Mivart) as well as of de Vries, Bateson, and the early Morgan, the field of evolutionary biology was rife with debate and alternative thinking, which was effectively eliminated by the founders of the MS, who freely threw ad hominem barbs at those they branded as “anti-evolutionists.” Although Simpson is seen as one of these founders, the discipline he represented –
Session: V.7 Room: Peter Chalk 1.6 Time: Friday 11:30-13:00
Session: VIII.8 Room: Peter Chalk 2.6 Time: Saturday 9:00-10:30
Sara Schwartz
Open University of Israel, Raanana, Israel
The Nature of Competition and Competition in Nature
It was in 1991 that Evelyn Fox Keller published her ideas about the social connotations which infuse the theory of evolution by natural selection. She was not the first to point out this fact. But what Keller demonstrated was not merely that the concept of competition imposes ideological connotations onto evolutionary theory, but also that the very use of this concept is erroneous. The error lies in the presupposition of a zero-sum game; namely, that in the challenging situations in which organisms find themselves, profit for one organism necessarily involves a loss for another.
The presupposition of a zero-sum game is, indeed, basically erroneous. Any solution which fits local conditions may undergo positive selection, whether accomplished through mutation and phenotypic variation or through the cooperation of individual organisms of different species – from lichens formed by an association of fungi and algae, up to true genomic union. Natural selection is commonly considered to be a consequence of phenotypic variation, differential fitness and heredity. However, differential fitness does not inevitably entail competition.
It is my intention to examine the extent to which contemporary evolutionary and ecological thinking is still influenced by the "competition" metaphor and by the assumption that adaptation and speciation are explained by conflict and competition, even in cases where this assumption is not openly an integral part of the theory's premises. In my talk I shall deal with this in two ways. The first is intra-theoretical: surveying the rich literature about symbiosis which has been published during the last two decades and appraising whether this line of research has overcome the use of the metaphor of competition. The second is meta-theoretical: analyzing approaches which neutralize metaphorical connotations by theoretical formalization, and examining how successful these approaches have been in avoiding the competition hypothesis.
Astrid Schwarz
Institute of Philosophy, Technische Universität Darmstadt, Darmstadt, Germany
Hybrids in ecology: putting things in place
Most studies on the production of scientific and technological knowledge have focused on the lab as the place where instruments, epistemic things, and experimental systems are localised and from which they originate. In contrast, field sciences have attracted less attention even though they have an explicit practice of place and deal with appropriate techniques. I am interested in the differences and dynamics of field and lab sciences as different cultural territories in terms of their practices of place.
I will be looking at the oscillation and exchange between the two territories, the intersection of different concepts of place and changing modes of border traffic. Genetically modified organisms, or restoring a whole piece of nature (e.g. pit lakes) are entities that might be described in the terminology of hybrids. What happened for instance when the biotechnology laboratories were opened up and the whole society became a lab in order to implement genetically modified organisms? What happens to practices of place when strongly ecologically informed concepts like biodiversity or ecosystem functions turn from a conservation-related and locally oriented notion to global socio-economic considerations and even geo-strategic conflict lines between regions rich in biodiversity and countries interested in the use of it?
The tentative answer I propose is that the field sciences depend on the exchange between field and lab. Accordingly, theoretical as well as applied ecology would be characterised by commuting objects and people and a permanent exchange of methods between places of different degrees of closedness.
Session: VII.10 Room: Peter Chalk 2.3 Time: Friday 16:30-18:00
Norberto Serpente
The Wellcome Trust Centre for The History of Medicine at UCL, London, United Kingdom
‘The Visualisation of the Invisible in Cell Biology: The Use of Models Describing Cell Function as a Consequence of the Molecular Revolution’ (1970-2000)
By the early 1980s a new type of visual representation used to portray the cell as the unit of life started to be deployed in articles and cell biology textbooks. During that period the number of images obtained with microscopes decreased dramatically and visual models portraying molecular interactions became prominent. Soon after recombinant DNA technology and new approaches from biochemistry and immunology entered the field of cell biology in the early 1980s, a new kind of visuality was developed, accompanying new experimental set-ups and creating new objects for investigation.
I start from the hypothesis that this new visualisation gave rise to a new set of problems for further biological research and thus developed its own dynamics. It created a sort of hyper-reality in the sense of Baudrillard, an enclosed universe of models without referents. However, it had contact with the experimental understanding of cellular processes and the shaping of new research questions.
To understand the development of this hyper-reality and its relation to the world of experiments, I am keen to investigate two areas of biological enquiry that emerged or were reconfigured in those times: signal transduction and gene regulation. The very existence of these areas hinges on this new visuality, and they had few connections with the former research objectives in cell biology.
In my talk I will describe the representational shift in cell biology. In addition, I will present my findings on how the new kind of imagery used in textbooks was produced and what kind of criteria for the selection of images for publication were at stake. In doing so, I want to substantiate my hypothesis of a condition of "hyper-reality" produced by the new visualisation procedures.
Session: VI.4 Room: Peter Chalk 1.1 Time: Friday 14:30-16:00
Renard Sexton
University of Maryland, College Park, MD, United States
Session: I.11 Room: Peter Chalk 2.1 Time: Thursday 11:30-13:00
Public Policy Implications of Environmental Mechanisms
The concept of mechanisms has proven useful in the description of medical disease, scientific discovery, and neuroscience, among other things. Though it has never been applied to ecosystem disease, an entities-and-activities styled mechanistic view presents an intriguing and useful tool for environmental policy makers.
Estuary eutrophication provides an excellent example of ecosystem disease, with diverse and complex entities that combine to create a public policy challenge. Characterized by excess nitrogen and phosphorus disrupting normal nutrient flow patterns, this condition generates an overabundance of phytoplankton and, soon thereafter, extreme anaerobic conditions. These result in dramatic vertebrate die-offs, rising anaerobic bacterial and viral toxins, and plummeting underwater plant populations. This case can be used as an analogous model for developing and implementing ecosystem management techniques that also can be applied to other diseased systems.
Several major considerations allow this mechanistic view to offer unique options and conclusions to environmental policy makers. Developing legal tests to define a 'diseased' ecosystem will streamline the treatment process by providing a definite criterion that considers ecosystem evolution and other confounding factors. By defining a 'normal' ecological mechanism, deviations can be more easily diagnosed and treated. Lastly, by combining the entities and activities to characterize mechanism function, specific intervention points can be evaluated to determine the most effective implementation.
Rule-making authority for environmental policy in the United States resides in the executive branch, in such agencies as the U.S. Environmental Protection Agency and the Department of Energy. In contrast with the historical role of Congress and the Federal courts, the more streamlined rule-making process allows for greater efficiency. An improved methodology at that level can therefore easily direct resources toward the appropriate diseased function within a mechanism.
Akihisa Setoguchi
Osaka City University, Osaka, Japan
War and Biology: The Transformation of Entomological Research in Japan, 1918-1945
It is a well-known fact that the wars of the twentieth century changed the relationship between science and governments. World War I was the war of the chemists, and World War II had a great impact on physicists. On the other hand, biology, which made few contributions to the development of weapons, seems to have been impacted to a lesser degree by the wars. Moreover, Japanese science before World War II has attracted little attention because it is believed that Japan's failure to mobilize its scientists led to its loss in the war. In this presentation, however, I will reveal that World Wars I and II radically transformed entomological research in Japan. In Japan, both entomology and biological research began in the 1870s with governmental support for agricultural experimental stations. Initially, most of the entomological research was confined within the framework of natural history. However, after World War I, the Japanese government began to reorganize agricultural research in order to increase crop yields. In 1921, chlorpicrin, the first domestically synthesized insecticide in Japan, was put on sale as a crop fumigant. Since chlorpicrin had been used as a chemical weapon during World War I, the Japanese Army helped to manufacture the insecticide. The army also began producing hydrogen cyanide insecticide in the 1930s; this insecticide was diverted for use as a chemical weapon in the early 1940s. During World War II, there was an outbreak of malaria – which is transmitted by mosquito vectors – in the Japanese Army. It is commonly known that while the U.S. Army used DDT to exterminate mosquitoes, the Japanese Army failed to manufacture this insecticide. However, I will show that after the 1940s, entomologists in Japan shifted their attention to medical research due to the mobilization for malaria research. In summary, due to World Wars I and II, Japanese entomology shifted from natural history to becoming a mission-oriented discipline involving chemistry, biology, and medicine.
Session: XI.8 Room: Peter Chalk 2.6 Time: Saturday 16:00-17:30
Oron Shagrir
Departments of Philosophy and Cognitive Science, The Hebrew University of Jerusalem, Jerusalem, Israel
Marr's computational theories revisited
My aim is to reassess and explicate the goals and structure of Marr's computational theories. I suggest that computational-level theories aim to explain certain semantic tasks, and that they achieve this goal by exhibiting how mathematical relations between the representing entities correspond to mathematical relations between the represented entities. I will then suggest that the conceptual framework of computational-level theories is logically independent of Marr's top-down methodology, which was seldom adopted in subsequent scientific research.
The goal of computational theories is not merely to describe but to explain. Marr writes: "The key observation is that neurophysiology and psychophysics have their business to describe the behavior of cells or of subjects but not to explain such behavior". Computational theories, in contrast, do explain behavior. The behavior explained by computational theories consists of certain patterns of information processing, in Marr's terms: "a mapping from one kind of information to another", or "mapping from one representation to another". I call these patterns semantic tasks. Examples of semantic tasks are shape from shading, depth from disparity, and edge-detection.
Consider edge-detection. Marr describes the process as zero-crossings (detecting the zero-values) of (∇²G)*I(x,y), where I(x,y) is the array of light intensities (the retinal image), * is a convolution operator, and ∇²G is a filtering operator that highlights sudden intensity changes in the image (G is a Gaussian and ∇² the Laplacian operator ∂²/∂x² + ∂²/∂y²). But why call the cells that detect the zero-values "edge-detectors"? After all, what they detect is a property of the image of the retina, i.e., sudden changes in the "grey levels", and not a property of objects in the visual vicinity. As Marr says: "Up to now I have studiously avoided using the word edge... The reason is that the term edge has a partly physical meaning – it makes us think of a real physical boundary, for example – and all we have discussed so far are the zero values of a set of roughly band-pass second derivative filters. We have no right to call these edges, or, if we do have a right, then we must say so and why".
The aim of computational theories is to answer this why-question, i.e., to explain the inference from the detection of zero-values to representing boundaries of objects. The explanation, I maintain, is that there is a correspondence between the mathematical relations between the representing entities, i.e., the zero-values of (∇²G)*I(x,y), and the mathematical relations between the represented objects, i.e., changes in light reflectance along boundaries of objects. Marr exhibits this correspondence in terms of the physical constraints that the environment imposes on the cognitive system. In the presentation itself I will expand on it, saying how it constitutes an explanation of semantic tasks in general and of edge-detection in particular.
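For readers unfamiliar with the formalism, the following minimal sketch (an editorial illustration, not taken from Marr or from the paper) shows the zero-crossing idea in Python: the image I(x,y) is filtered with a Laplacian-of-Gaussian operator and pixels where the filtered result changes sign are marked as candidate edges. The function name, the sigma value, and the synthetic test image are illustrative assumptions.

import numpy as np
from scipy import ndimage

def zero_crossings_of_log(image, sigma=2.0):
    # Filter the image with the Laplacian of a Gaussian, (nabla^2 G) * I(x, y).
    filtered = ndimage.gaussian_laplace(image.astype(float), sigma=sigma)
    # Mark pixels whose sign differs from the right-hand or downward neighbour.
    signs = np.sign(filtered)
    crossings = np.zeros(signs.shape, dtype=bool)
    crossings[:-1, :] |= signs[:-1, :] != signs[1:, :]
    crossings[:, :-1] |= signs[:, :-1] != signs[:, 1:]
    return crossings

# Example: a bright square on a dark background yields zero-crossings along its border.
img = np.zeros((64, 64))
img[16:48, 16:48] = 1.0
print(int(zero_crossings_of_log(img).sum()), "zero-crossing pixels")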
Session: VI.9 Room: Peter Chalk 2.5 Time: Friday 14:30-16:00
Adam Shapiro
University of Chicago, Chicago, IL, United States
Textbook Authors and Textbook Salesmen: Contrasting Communities of Biology Knowledge Production
In the 1910s and 1920s the community of high school biology textbook authors was small, close-knit, geographically centered, egalitarian, collaborative, and self-aware of its role in shaping the future of a new pedagogical discipline. By contrast, the men who were responsible for presenting textbooks to the masses, cadres of textbook salesmen in the employ of rival publishers, were diffuse, hierarchical, combative and largely unconcerned with the content of the books they were peddling. Yet both of these groups constituted communities, each united by interests and values. These communities contrast sharply in the content of those values, yet they overlap in one particular: a dedication to the creation and diffusion of biology textbooks. This paper explores how these two very different communities co-jointly created a new pedagogical discipline of high school biology.
As a particular example of how these two social networks functioned, this paper will examine the creation of and controversy over Arthur Clement's 1924 textbook, Living Things, An Elementary Biology. Clement was considered by other biology textbook authors to be an "outsider", and his book was controversial because of his previous role as Superintendent of Biology Instruction in the State of New York. The authors' collective reaction to Clement's book and the efforts of sales agents to position their books in relation to it illustrate the implicit norms and conflicting attitudes towards textbook production that these communities possessed.
Textbooks are frequently perceived as the artifact of a scientific community, whose social ideals may be embedded in the content of its products. This paper suggests a more complex view of textbooks by showing them to be a hybrid creation of different constituencies. The values preserved in them are necessarily conflicting, often arbitrated at the point of intersection between authors and sales agents: the editorial desk.
Session: X.4 Room: Peter Chalk 1.1 Time: Saturday 14:00-15:30
Session: V.4 Room: Peter Chalk 1.1 Time: Friday 11:30-13:00
Nick Shea
Faculty of Philosophy, University of Oxford, Oxford, United Kingdom
Developmental Systems Theory as a Claim about Inherited Information
Developmental Systems Theory (DST) emphasises the importance of environmental resources in development. A common, deflationary reaction is that the causal importance of non-genetic factors in development has always been appreciated. This paper argues that DST can be reformulated to make a more substantive claim: that the special role played by genes – to carry inherited information – is also played by some (but not all) other developmental factors. This is consistent with Griffiths' parity thesis: 'Any defensible definition of information in developmental biology is equally applicable to genetic and non-genetic causal factors in development' (2001, p. 396). However, it does require DST to give up its radical claim that every developmental resource can be treated as informational (Griffiths & Gray 1994). DST can then accept 20th century biology's most important discovery, that something extraordinary is going on with genes, and then contribute the observation that that special property is shared by some other developmental factors. The key step is to show that the right way of understanding inherited information is equally applicable to genes and to some, but not all, non-genetic factors in development. Griffiths counts as an inheritance system 'any biological mechanism which produces resemblances between parents and offspring' (2001, p. 400). However, not all such factors are evolutionarily relevant. Morphogen gradients received from the mother provide positional information in the developing embryo and so act as a cause of parent-offspring similarity, but they are not evolutionarily relevant because variation is not heritable. Only heritable causes can have the evolutionary function of producing a particular phenotypic outcome. In this paper, I argue that the demands on being an inheritance system are even more stringent. Such a system must have the evolutionary function of producing heritable phenotypes (a meta-function arising over a series of episodes of selection of heritable phenotypes). DNA has this meta-function (hence maintenance and repair machinery), as may chromatin marks (if they turn out to be the basis of much heritable phenotypic variation) and imitation in apes; but most developmental resources do not. Interpreting DST as a claim that there is some non-genetic inherited information turns it into a striking, empirically testable hypothesis, driving the sort of investigations discussed by Jablonka & Lamb (2005). DST's characteristic rejection of a gene vs. environment dichotomy is preserved, but without dissolving all potentially explanatory distinctions into an interactionist causal soup.
Ayelet Shavit1,2
1) University of California, Davis, Davis, CA, United States, 2) Tel Hai College, Upper Galilee, Israel
Location, Location, Location! Negotiating Places and Perspectives in a Biodiversity Database
The last century ended with the completion of genomic databases for several species, which became an unexpected resource for producing new theoretical approaches and for integrating several inheritance systems. The 21st century begins with huge budgets being deployed to construct standardized, large-scale and long-term biodiversity databases of species distribution, in a similar hope of stimulating new theoretical integrations. Recording an organism's locality is a mandatory first step for modeling its niche, its niche construction, and the inheritance of its modified niche. I will look into such a species distribution database recently assembled at the Museum of Vertebrate Zoology, Berkeley, U.S.A., and unfold how the data element 'locality' was differently characterized at different times and for different purposes. This shifting 'locality' will illustrate the role of negotiation and coordination between different theoretical commitments in order to systematically collect data. At different times and for testing different models, being 'systematic' requires different norms of data collection and organization. I will illustrate how the history of a data field's formation constrains some of the ways to mine a global biodiversity database for integrating evolutionary and ecological information.
Session: X.8 Room: Peter Chalk 2.6 Time: Saturday 14:00-15:30
Rivers Singleton
Departments of Biological Sciences and English, University of Delaware, Delaware, United States
Early Concepts of Bioenergetics: Herman Kalckar, Fritz Lipmann, and Severo Ochoa
Eduard Buchner's 1897 report of a yeast cell-free extract, which could ferment sugar to alcohol and CO2, led to biochemistry's disciplinary explosion. In the 20th century's early decades, biologists from a variety of disciplines began to address fundamental questions about living beings with increasingly sophisticated chemical tools. Many questions concerned the energetics of biological processes, such as muscle contraction, and by the late 1920s numerous phosphorylated compounds involved in metabolic processes had been characterized. Other investigative lines demonstrated oxygen's importance in these processes. In 1937, these two investigative lines came together with Herman Kalckar's demonstration that O2 consumption and phosphorylation reactions were linked, a process later called "oxidative phosphorylation." In later work Kalckar and Fritz Lipmann, independently, introduced the notion of a "high energy bond" and demonstrated that oxidation processes led to ATP formation. Ochoa demonstrated that the ratio of organic molecules phosphorylated to O2 consumed was constant, with a value around 3.
In this paper I will explore ways that Kalckar, Lipmann, Ochoa, and others created a bioenergetics dogma that dominated biochemists' work for more than twenty years, until finally overthrown by Peter Mitchell's radical chemiosmotic vision.
Session: XI.8 Room: Peter Chalk 2.6 Time: Saturday 16:00-17:30
Matti Sintonen
Department of Philosophy, University of Helsinki, Finland
Session: VII.2 Room: Newman C Time: Friday 16:30-18:00
Scientific discovery, understanding, and the modelling of neurocomputational mechanisms
This presentation brings together two currents in contemporary philosophy of science: the nature and role of models in science, and the revived interest in mechanisms. Both lines of research are closely tied to the downfall of the received view of theories, the way they are discovered and articulated, as well as their role in explaining and predicting phenomena. As philosophers of biology and the social sciences in particular have pointed out, theories are not deductive structures, nor do they contain domain-free and hence universal covering laws to serve the purposes of explanation. Indeed, practicing scientists in these fields do not seem to be interested in chasing such entities. Instead, they see their work as explaining phenomena by discovering the underlying mechanisms that bring about these phenomena. These underlying mechanisms and their behaviour make phenomena intelligible or understandable to us. The way of achieving this goal is experimentation and manipulation, conducted, typically, by modelling and simulating the studied mechanisms.
In fact this pattern of explanation and understanding extends easily to all sciences, from physics and especially engineering to biomedicine, neuroscience, brain research, psychology and cognitive science. Thus, the fruitful strategy is to approach mechanisms and scientific understanding through the study of models. Indeed, in biology and cognitive science, most of the research is conducted by modelling mechanisms.
Models represent and embody mechanisms, but we suggest that they should also be seen as epistemic artefacts: they function as tools for generating scientific understanding. We argue, with the help of examples, that this approach is firmly based on actual scientific practice in physics, biology and biotechnology (from ecology and evolutionary theory to molecular biology), biomedicine, neuroscience and brain research, as well as in psychology and cognitive science. Apart from being congenial to the way scientists see their practice, the approach through modelling mechanisms provides a fresh way to thematise many philosophical topics. Of those topics, we focus especially on the ways in which modelling mechanisms contributes to scientific understanding and facilitates integration of knowledge across disciplinary domains. But equally well, modelling mechanisms is a tool for generating potential answers to research questions and hence a way of accommodating a logic (or heuristic) of discovery in a unified picture of inquiry.
Daniel Sirtes
University of Basel, Basel, Switzerland
The Nexus, Mechanisms and Mechanism Families
In contrast with earlier conceptions of causal-mechanical explanation (Salmon, Railton), newer versions of mechanistic explanation promote the concept of "a mechanism" as central (Glennan 2005; Machamer, Darden & Craver 2000; Craver forthcoming). A short assessment of these conceptions (Glennan's two-tiered model and Craver's conditions of explanatory relevance) and their comparison with the notion of the causal-mechanical nexus will reveal the arbitrariness of defining the phenomena to be explained and therefore of mechanism boundaries. Moreover, discussing the explanandum phenomenon of the action potential, it will be argued that the more generally the explanandum is defined (where generality can be constructed along different dimensions), the more degenerate a notion of mechanism is in use. This will be followed by a plea for the indispensable role of pragmatics in delineating a mechanism, in combination with the notion of mechanism families.
Evaluating The Risk Posed By Biological Invasions
Controlling populations of "exotic" organisms that become well established in a given geographical region can be both difficult and expensive. Some jurisdictions are making efforts to evaluate the risk posed by potential newcomers in order that preventative measures might be taken before they become established. These potential "invaders" might be deemed undesirable for several different kinds of reasons: they can cause economic damage, compete with or prey upon more familiar "native" species, and present human health risks (directly or as pathogen vectors). How we evaluate the risks posed by a possibly invasive species is thus not only a function of its likelihood of establishment, but of the importance we assign to the relevant reasons for concern. My examination proceeds at two different levels. First, I examine a case of risk analysis done at the policy level, a recent study of potential invaders to the Galveston Bay region. I claim that the authors make implicit, substantial commitments to the importance of various concerns without explicitly discussing them. I raise further questions regarding their method of quantification of risk and whether their methods are effective. Second, I begin to explore more general questions about analyzing the risk posed by invasions – what sort of information do we need for such analyses to be effective, and is it possible to incorporate multiple sources of concern into a single analysis in a way that captures what we seek to measure?
Session: I.7 Room: Peter Chalk 1.6 Time: Thursday 11:30-13:00
Christopher Smith
Aston University, Birmingham, United Kingdom
Session: VII.3 Room: Newman D Time: Friday 16:30-18:00
Victorian Physiology and Human Automatism
During the 1830s, Marshall Hall carried out innumerable experiments on a great variety of animals to establish the concept of a 'reflex arc'. In France, F. L. Goltz showed that decerebrate frogs were still capable of complex behaviours. Thomas Laycock in England and Ivan Sechenov in Russia sought to apply the reflex idea to the brain. This paper follows the debate in the periodical literature of mid-Victorian England. Using the work of William Benjamin Carpenter as the major focus, it discusses the contributions of Herbert Spencer, Thomas Henry Huxley, William Clifford, Henry Maudsley, and others. The previous outing of this issue, in the post-Cartesian seventeenth century, had been largely suppressed by ecclesiastical authority. In the nineteenth century ecclesiastical power had waned, at least in England, and the debate could take a more open form. As neurophysiology and behavioural science developed, as acceptance of Darwinian evolution became more widespread and with the beginnings of psychiatry (Maudsley), it became more and more difficult to deny that brain and mind were part of the natural world and subject to the usual laws of cause and effect. This, of course, had powerful implications for those of a spiritualist persuasion and raised the ire of luminaries such as Sir Arthur Conan Doyle, Sir Oliver Lodge and Alfred Russel Wallace. It also had powerful implications for the human self-image and for jurisprudence. These implications are still with us, and the work of neurophysiologists such as Benjamin Libet has only reinforced them. Should humans be regarded as 'conscious automata' and, if so, what becomes of 'free will', 'responsibility', and the rule of law? The Victorian debate is still useful and relevant.
Nathan Robert Smith, Michael Trestman
University of California, Davis, Davis, CA, United States
Session: III.5 Room: Peter Chalk 1.3 Time: Thursday 16:30-18:00
Marianne Sommer
ETH Zurich, Zurich, Switzerland
Session: XII.11 Room: Peter Chalk 2.1 Time: Sunday 9:00-10:30
Natural Genealogies and the Objectivity of Approaches, Technologies and Objects in Molecular Anthropology
In 1962 Emile Zuckerkandl introduced the term molecular anthropology to characterize the study of primate phylogeny and human evolution on the molecular level. The occasion was the Burg Wartenstein symposium on classification and human evolution sponsored by the Wenner-Gren Foundation. Although what was thus named molecular anthropology can be traced back to the immunological studies of George H. Nuttall in the early twentieth century, the event and the succeeding Wartenstein symposium on the progress of molecular anthropology in 1975 were indicative of some consolidation. Papers on primate protein sequence analysis, immunological analysis, and DNA hybridization were combined in the 1976 volume with papers on primate paleontology and physical anthropology. The debates revolved around the issue of the right primate phylogeny and times of divergence, in particular the relation of humans to the anthropoid apes. Already at this point, arguments could be heard for the intrinsic superiority of the molecular data, even though those working on the molecular reconstruction of primate phylogenies were far from agreement on the best approach, the meaning of structural and sequence differences for phenotypic evolution, or the nature of the molecular clock(s). In its approximately forty-year history, molecular anthropology has refined and diversified its technologies, methods, and objects, but it has not resolved the controversies. At the same time, central figures of the field, which now includes paleogenetics and -genomics in the reconstruction of phylogenies, continue to be overconfident, and sometimes aggressively so, vis-à-vis physical anthropology and paleoanthropology. With the commercialization of the tools of genetic ancestry tracing, similarly strong claims of scientific objectivity are made towards publics on which molecular anthropology depends as lawgiver, as data providers, and as service customers. The paper reconstructs an episodic history of molecular anthropology in contact with physical anthropology and the associated strategies of negotiation, appropriation, and attempts at takeover. Genealogies will be of three-fold concern: the assumption of a natural phylogeny that is preserved in macromolecules; phylogenies and pedigrees as commercial goods; and the lines of descent of molecular anthropologists. These genealogies are associated with a belief in the objectivity of numbers, logic, and mathematics, the objectivity of machines and instruments, and the objectivity seen to reside in the epistemic object itself – the anthropological gene as carrier of human history.
Tibor Solymosi
Southern Illinois University at Carbondale, Carbondale, IL, United States
From The Principles of Psychology To Dynamic Systems: The Influence Of Darwin On James, Dewey, And Cognitive Neurobiology
The American philosophy of pragmatism developed largely in response to Darwin's theory of evolution by means of natural selection. While the significance of Darwin varies for each pragmatist, the influence of the theory of evolution on pragmatic approaches to the philosophy of mind is undeniable. Both William James and John Dewey develop functional accounts of mentality largely indebted to evolutionary insights. While the two positions are similar, there are noteworthy differences between them. It is the aim of this paper to compare and contrast these evolutionary accounts of James and Dewey, emphasizing that significant differences amount to differences in the influence of Darwin on the particular philosopher's thought. Consideration is given to James's Principles of Psychology and his article "Does Consciousness Exist?" and especially to Dewey's Influence of Darwin on Philosophy and his Experience and Nature. Particular attention is paid to Dewey's criticism of James's use of the reflex arc concept in psychology as well as the relationship between mind and body. Finally, attention is paid to contemporary issues in the philosophy of mind and neuroscience, inquiring into the influence of James and Dewey as well as Darwinian evolution. Final thoughts are given to dynamic systems theory (whose origins are found in Dewey's 1896 article "The Reflex Arc Concept in Psychology") and connectivist approaches more generally as found in the work of Walter J. Freeman, W. Teed Rockwell, the Churchlands, and Daniel Dennett.
Session: VI.1 Room: Newman B Time: Friday 14:30-16:00
Ana Soto, Carlos Sonnenschein
Tufts University School of Medicine, Boston, MA, United States
Physicalism, diachronic emergence and downward causation in experimental biology
Based on the arguments presented by Paul-Antoine Miquel, we reject the thesis of the causal completeness of the physical world. Downward causation can be thought of without contradiction once we reject this assumption. We argue that diachronic emergence and downward causation are fundamental concepts for the scientific exploration and understanding of complex biological phenomena such as development and carcinogenesis.
Since the introduction of the cell theory, most biologists readily accept that the cell is the unit of life. In multicellular organisms, no single cell has an existence independent from the whole organism. These organisms and their cells are ontogenetically linked. This means that thinking about multicellular organisms as made up of cells that relinquished their independence is inaccurate. Rather, a zygote – a cell resulting from the union of a female and a male gamete – divides, producing more cells, which are organized in a tri-dimensional pattern. Both association patterns and cell types change as tissues and organs are formed. Hence, we propose to consider multicellular organisms as complex systems in which the relations among their parts are contextual and interdependent. This reciprocity makes it difficult to establish detailed cause and effect relationships. In addition, molecules are being sent from one cell to its surroundings. Such molecules residing outside cells are no longer a part of a cell and act on their immediate neighbors, or are carried to distant places through the bloodstream where they modify the activity of yet other cells, which in turn produce molecules that react with other cells, and so on. These facts reveal that these hierarchical levels are entangled, at times precluding experimentally isolated cells from revealing their full role in situ in the originating organism. But where does this entanglement come from? The context of multicellular organisms is a product of history (evolution and ontogeny). We argue that this context-dependence is an effect of diachronic emergence. We will discuss the relevance of these concepts to the understanding of organogenesis and carcinogenesis.
Session: XII.6 Room: Peter Chalk 1.5 Time: Sunday 9:00-10:30
Session: XI.3 Room: Newman D Time: Saturday 16:00-17:30
Valery N. Soyfer
George Mason University, Fairfax, United States
Stalin and fighters against cell theory
In 1939, to celebrate the 60th birthday of Comrade Joseph Stalin, Stalin Prizes were established in the USSR. Those who aspired to receive this award were meticulously selected by a special Committee, and the final long list of recipients was then approved by Stalin.
But in 1950 an extraordinary event happened: by special decree of the Government, the Stalin Prize in Science was given without the usual procedures and to only one recipient – Olga Borisovna Lepeshinskaya – for her discovery of "Live Matter". She was nominated for the award by Stalin himself. The leaders of the Communist Party of the USSR then ordered scientists to support Lepeshinskaya and to condemn the "bourgeois" science of cell biology. At a special session of two Soviet Academies – the USSR Academy of Sciences and the USSR Academy of Medical Sciences – following Olga Lepeshinskaya's speech, 27 leading Soviet biologists endorsed her absurd claims and condemned cell theory. All research in this discipline was forbidden in the USSR, and many falsifiers presented fabricated 'evidence' in support of the "new Cell Theory". According to this "Theory", new cells may arise from non-cellular 'live' matter, and the well-established rule that any cell can appear only through division of a maternal cell was rejected as erroneous.
In the presentation, the story of this political intervention in science as well as the role of politicians and scientists who adopted the Communist Party order will be discussed.
Maxi Stadler
Imperial College London, London, United Kingdom
Quantifying Excitable Tissues in the 1930s
A persistent motif in interwar and later commemorative presentations of developments in electrophysiology between the world wars is that of a fundamental, if gradual, instrument-driven progress in the 'electrical analysis' of excitation phenomena in nerve and muscle. This was a case where a science profited deeply from the amenities and requirements of modern living, in particular in the form of the thermionic valve and advances in electrical engineering. Although 20th century neurophysiology has not generated a great deal of historical attention, a number of historians have picked up on this theme; electrophysiology indeed may paradigmatically illustrate the much-laboured relevance of instrumentation to scientific practice and concept formation. The most recent and in-depth studies available – cultural histories of the EEG – have done much to complicate readings of this specific case, but naturally are concerned with the latter, partly derivative technology rather than with the frequent substrate of technological expertise, the electrophysiology of excitable tissues.
This paper focuses on this most prestigious field of interwar research, in particular on a group of physiologists moving between the centres of British physiology: Cambridge, London, and the Plymouth Marine Biological Laboratory. During the 1930s, a number of early attempts to treat excitation processes in a quantitative and/or mathematical manner emerged from this group, eventually superseded by the Cambridge physiologists Hodgkin and Huxley and their famous 'model' of the action potential in the early 1950s. Here, I aim to locate these earlier attempts in their scientific and cultural context. By the 1930s, research on bioelectrical phenomena was a wide and active field, stretching far beyond the realms of elite academic physiology. Interest in cellular excitability ranged from agricultural plant physiology to pharmacology to medical applications of electricity.
Constantly being charged with indulging in a 'useless' and merely academic science of stimulating frogs, endangered by the rising science of biochemistry, and still engaged in a campaign of making medicine scientific while having to fight popular misconceptions and medico-electric charlatanism, natural-science-minded nerve physiologists had to steer a complex terrain. In conjunction with the local and peculiar conditions of physiological training and research prevailing in Cambridge and London, my argument goes, these contexts are crucial for arriving at a more nuanced understanding of electrophysiology's 'instruments revolution'. While collaborative liaisons with instrument makers served crucial purposes in controlling, technically and socially, increasingly complex apparatus and data production – as the Lancet noted in 1938, the 'amplifier in neurophysiology proved to be something of a Pandora's box' – emphasis is given, too, to the widespread interest in electro-medicine and the nascent electro-medical instruments industry, both as stimulating research problems and as mediating the transfer of electrotechnical expertise. As a tentative
conclusion, I suggest that early practices of quantitative modelling of neuronal processes are better conceived in relation to such and similar interwar concerns with the neuromuscular body rather than a (proto-)cybernetic brain/mind.
Session: VII.4 Room: Peter Chalk 1.1 Time: Friday 16:30-18:00
Jamie Stark
Leeds University, Leeds, United Kingdom
Beyond Atomism And Holism: Anti-Reductionism In The 20th Century
The rise of molecular biology through the 20th century has been a hugely popular subject for study within the history of science. From its origins in the 1930s through to the present day, molecular biology has seemed to hold indefinite promise as the darling of the biological sciences, with its potential in the fields of genetics and medicine trumpeted time and again. Whilst there is almost universal consensus that molecular biology has contributed a great deal of knowledge, at various stages there have been individuals who held the view that this discipline could not, on its own, be responsible for the purported 'revolution' in biological understanding. So-called biological anti-reductionism has received comparatively little attention within the history of science, despite the tightly-knit community which sprang up around this particular doctrine. Moreover, when anti-reductionism is discussed, the subtleties within this particular community have tended to be overlooked, thus obscuring the plethora of differing, and sometimes contrasting, approaches which were devised to halt the seemingly inexorable slide towards a progressively more mechanistic biology.
This paper analyses a cross-section of the anti-reductionist community by examining three individuals – Arthur Koestler, Conrad Hal Waddington and Ludwig von Bertalanffy – who, whilst all committed to opposing the molecularisation of biology, each developed their own distinctive brand of anti-reductionism. As such, whilst the anti-reductionists were united behind a common cause, there was fascinating and wide-ranging debate as to what constituted the most suitable and effective way to steer biology away from its slide towards atomism.
In illuminating the variety of responses to reductionism, this paper also illustrates the myriad factors which influenced the way in which Koestler, Waddington and von Bertalanffy formulated their approaches to anti-reductionism; as such, science and personal philosophy combined closely for all three. It is intended that this study will be extended to include a full appraisal of the intellectual ties within the wider anti-reductionist community which, it is anticipated, will yield an intriguing insight into the reaction to the ongoing molecularisation of biology.
programmes. This similarity may motivate regarding protein synthesis as information-guided in the first place, although I leave open the further issue of whether instructing reduces to external and explicit determination.
Session: X.4 Room: Peter Chalk 1.1 Time: Saturday 14:00-15:30
Ulrich Stegmann
Session: XII.7 Room: Peter Chalk 1.6 Time: Sunday 9:00-10:30
Department of History and Philosophy of Science, University of Cambridge, Cambridge, United Kingdom
Joan Steigerwald
Against Causal and Informational Parity
Instrumental Reasoning in the Eighteenth Century
Instrumental reasoning has frequently been cast as one of the negative legacies of the eighteenth century on modernity, responsible for reducing nature to standardization, calculability and efficiency. But recent studies of science have examined the complex and diverse roles instruments play in mediating between our reasoning and nature. If at times instruments are used to bring discipline and regularity to natural phenomena, they also play a creative role in the production of knowledge, demonstrating both phenomena and theories and assisting us in figuring out their relationships. In my paper I will focus on the instrumental reasoning used to study irritability and muscular contraction in the second half of the eighteenth century, looking first at Haller’s investigations in the 1750s and then at galvanic experiments in the 1790s. The instrumental investigation of irritability in the 1750s at first promised a more precise conception of it. But the property of irritability was as much a product of its instrumental investigation as the muscles investigated. Moreover, the more exacting these investigations became, the more imprecise their results, in that the property manifested varied depending how they were investigated and who was carrying out the investigation. New experiments into muscular contractions in the 1790s demonstrated a similar embeddedness of conception and phenomena in the instruments of investigation. The galvanic apparatus used to study muscular contractions became indistinguishable from the phenomena investigated. Indeed, investigators like Ritter used instrumental definitions to distinguish galvanic phenomena from electrical or chemical phenomena. In other words, the instruments mediating between reason and nature became the basis of knowledge of irritability and muscular contraction. Philosophies of nature at the end of the eighteenth century, so often characterized as reducing nature to our reason actually had a sophisticated understanding of the complex process of the construction of knowledge of nature. Kant gave the schemata a fundamental role in our judgments of
Positive accounts of genetic information typically aim to provide a naturalistic theory of what is known as semantic or intentional information. This is a notion of information that allows for errors in a strong, evaluative sense, and is therefore often tied to intentionality as understood in the philosophy of mind. Intentionality is a broad notion, covering mental states that have truth conditions (like beliefs) as well as states having satisfaction conditions (like desires). The intentionality of genetic information is usually construed in terms of notions that involve truth conditions (especially meaning, representation and reference). Recently, however, its intentional aspect has been reconstructed by appeal to notions such as instructions and programmes, which involve satisfaction conditions. Error in the case of instructions and programmes arises from failures in implementation.
Customary examples of instructed processes, such as cooking by recipe, involve of a series of steps. The role of instructions is to specify in advance what steps will be carried out and in which sequence. On the instructional account of genetic information (Stegmann 2005), processes like protein synthesis are informational to the extent they are specified in advance. In this presentation I offer an account of ‘in-advance- specification’ according to which it amounts to a certain way of causally determining an outcome. It is constituted by two peculiar properties, which I call ‘external’ and ‘explicit’ determination.
External and explicit determination is important for two reasons. First, these features single out a small class of biochemical processes from the large majority. Moreover, the class of processes singled out happens to include the paradigms of informational processes (protein synthesis and the like). Thus, in contrast to the causal parity thesis, I maintain that the allegedly informational processes (or molecules) differ causally from the non-informational. Second, the structural difference in determining outcomes is also found in man-made devices. Devices operating by explicit and external determination are the ones operating on stored
York University, Toronto, Ontario, Canada
experience. A product of the transcendental imagination, the schemata were a means of linking sensory intuitions and concepts. Novalis related the schemata as an imaginative instrument of judgment to the instrumental reasoning used in galvanic experiments. Schelling also argued that whether phenomena were to be understood mechanically, chemically or organically depended upon our methods and instruments of investigation.
nodes are biological objects (e.g. protein sequences) and whose edges are particular biological relationships (e.g. functional interaction in a mechanism, or evolutionary heredity). There are theories of alignment in computational and systems biology that are realized as computer algorithms which predict edges between nodes in the network as a function of the pre-existing network; the feedback loop is completed when experimentalists update the network with empirical results they discovered as a consequence of the theoretical predictions. The theories of alignment depend on sometimes implicit evolutionary arguments of positive or negative selection, as well as drift within a neutral region of sequence space. Thus “alignment” forms a new meeting ground between population genetics and descriptive biology.
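By way of illustration only (a minimal sketch of the kind of alignment-driven edge prediction described above, not Sterner's own algorithm; the similarity scores, threshold and function names are hypothetical), the feedback step can be rendered as an "interolog"-style transfer: predict an edge between two sequences whenever each aligns closely to one endpoint of an already known edge.

    import itertools

    # Hypothetical pairwise alignment similarity scores between protein sequences.
    similarity = {("P1", "P3"): 0.92, ("P2", "P4"): 0.88, ("P1", "P4"): 0.15}

    # Pre-existing network: edges established experimentally.
    known_edges = {("P1", "P2")}

    def sim(a, b):
        # Symmetric lookup; pairs that were never aligned score 0.0.
        return similarity.get((a, b)) or similarity.get((b, a)) or 0.0

    def predict_edges(nodes, known, threshold=0.8):
        # Predict an edge (x, y) whenever x and y each align closely to the two
        # endpoints of an already known edge (an "interolog"-style transfer).
        predicted = set()
        for a, b in known:
            for x, y in itertools.permutations(nodes, 2):
                if (x, y) in known or (y, x) in known:
                    continue
                if sim(a, x) >= threshold and sim(b, y) >= threshold:
                    predicted.add(tuple(sorted((x, y))))
        return predicted

    print(predict_edges({"P1", "P2", "P3", "P4"}, known_edges))  # {('P3', 'P4')}

Experimental confirmation of such predicted edges would then be added back to the network, closing the feedback loop the abstract describes.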
Session: VIII.1 Room: Newman B Time: Saturday 9:00-10:30
Kim Sterelny
VUW-ANU, Wellington, New Zealand
Moral Nativism: A Sceptical Response
Moral nativism is newly popular. The key idea is that the human capacity to moralise is a developmentally entrenched evolved adaptation. The case for moral nativism is often formulated with language as an explicit model: so it is claimed that human moral cognition shows restricted cross-cultural variation; that it develops early and with a predictable sequence; and that it would present an intractable learning problem to the unprepared mind. However, our minds are prepared: we come pre-equipped with moral information, though nativist views vary about the nature and extent of our built-in capacity. This paper presents a sceptical response that draws on models both of cultural evolution and of epistemic niche construction. To the extent that moral cognition is universal, I shall argue that it is so because humans develop in highly prepared environments, not because their wetware is specially adapted for moral thought.
Session: X.2 Room: Newman C Time: Saturday 14:00-15:30
Session: IV.3 Room: Newman D Time: Friday 9:30-11:00
The Construction of a Developmental Niche: a means for phenotypic plasticity
Inheritance systems allow for the reliable transmission of crucial information from parents to offspring. The reproduction of the developmental system is the result of the reliable provision of a wide range of developmental resources necessary to reconstruct the organism’s life cycle. Organisms have developed a range of strategies to manage aspects of their own or their offspring’s developmental environment to guide the developmental process. Part of this developmental niche is the complex network that regulates the time- and tissue-dependent expression of genes. Maternal RNA, the cytoplasmic chemical gradients that are inherited with the mother’s egg, and the chromatin code of imprinted genes are just the beginning of a myriad of processes of maternal effects and rearing practices that will continue to influence gene expression levels and other developmental processes until adulthood. This control of the next generation’s developmental environment via ontogenetic niche construction provides important means for phenotypic plasticity, which confers both robustness and flexibility upon organisms.
Beckett Sterner
University of Chicago, Chicago, IL, United States
Reconnecting Evolutionary And Descriptive Biology: A Network Effect In Systems Biology
I describe a novel network effect emerging at the intersection of the evolutionary synthesis, descriptive molecular biology, and the ‘-omics’ initiatives. The effect is the growing utility of sequence databases to scientists as they contribute more and more sequences, and it is dependent on a feedback loop between theoretical inference and empirical investigation. The key technical innovation driving the feedback loop is the concept of “alignment,” for example of protein sequences. Via alignment, scientists can constitute networks (often represented as hypertext links on the Internet) whose
Karola Stotz
Indiana University, Bloomington, United States
Session: V.1 Room: Newman B Time: Friday 11:30-13:00
of molecular sequences that started back in the 1960s. By using molecular characters (protein and nucleic acid sequences), as well as statistical tools, the new evolutionists aspired to construct “true” phylogenies, free of the non-objective character they attributed to traditional, morphology-based systematics. However, it became increasingly clear that the new statistical analysis of molecular data was permeated by mathematical and empirical assumptions, and thus was not as “objective” as molecular evolutionists would have liked. A myriad of methodological problems have plagued the field since the 1970s, one of the most visible being the parsimony-likelihood debate.
What we want to do in this paper is to trace the continuity of statistical tools and methodological problems that have been imported from molecular evolution to comparative genomics. The annotation of genomes and the comparative analysis of genomes make use of many of the statistical tools and methodological commitments made by molecular evolutionists. With our analysis we aim to modulate the claims of “discontinuity” between molecular evolution and comparative genomics. Moreover, we will incorporate actors that had been left outside the picture, namely, evolutionists.
Room: Peter Chalk 1.6 Time: Saturday 14:00-15:30
Bruno J. Strasser
Yale University, New Haven, CT, United States
Natural History in the Genomic Age? The Making of GenBank, 1982-1987
The rise of experimentation and the decline of natural history have structured most narratives about the history of the life sciences from the 19th century to the present. Yet, with the advent of genomics, the constitution of complete collections of biological data has become, once again, a major practice in the life sciences, as it had been in previous centuries. This paper will explore to what extent genomics can be understood as a modern continuation of natural history. It will examine the epistemic values, the material practices and the moral economies attached to this way of knowing. This paper will focus on the history of GenBank, the largest database of genetic sequences in the world, which was created at Los Alamos in 1983. The debates about the scientific legitimacy of this project, its epistemic objectives and its relationship to experimental knowledge provide a unique window onto the historical continuities and discontinuities in the ways of knowing adopted in the life sciences over the last two centuries.
Session: X.7
Edna Suárez1,2, Víctor-Hugo Anaya3
1) UNAM, Mexico City, Mexico, 2) Max Planck Institute for the History of Science, Berlin, Germany, 3) Institute for Theoretical Biology, Humboldt-Universität zu Berlin, Germany
T.H. Morgan’s Multiple Agendas Related to Regeneration
Around the turn of the twentieth century many embryologists were optimistic. New technologies, institutions, questions, and experimental approaches provided ways to investigate previously elusive questions about development. Many of these new experiments involved highly invasive techniques, which disrupted the normal course of development. Thomas Hunt Morgan was one such optimistic embryologist, but he was also concerned about the implications of these disruptive studies. He wondered whether these were indeed revealing something informative about the normal course of development. Skeptical of most existing theories of development as well as their supporting evidence, Morgan set out to clarify confusion and resolve contradictory results and conclusions. He based his argument on the assumption that studying regeneration would provide a window into developmental processes that was not as problem-prone as other more invasive techniques. From 1897 to 1904 Morgan wrote over 30 papers and a book on a wide variety of naturally occurring regenerative phenomena.
Evolutionary Tools And Comparative Genomics: Continuity In The Shadow
Scientific rhetoric is one of the driving forces leading the “state-of-the-art train” in biology. Scientists promoting new areas of research seek to distance themselves from previously existing disciplines. Genomics has not been an exception to the rule, presenting itself as a new integrative form of biological knowledge, aware of the complexity of organisms, and highly critical of the deterministic, linear explanations of molecular genetics. Nevertheless, against the claims of discontinuity, and the promises of theoretical breakthroughs, the continuity of practices and tools between genomics and previously existing fields is still recognizable.
The field of molecular evolution, and in particular molecular phylogenetics, profited from the accumulation
Mary Sunderland
Session: V.1 Room: Newman B Time: Friday 11:30-13:00
Arizona State University, Tempe, AZ, United States
This work convinced him that regeneration accurately modeled early development. From today’s perspective we take for granted that studying regenerative processes will teach us something about the mechanics of basic development; this assumption, however, was established largely by the efforts of Morgan, who saw regeneration as a window into development. He made this argument clearly in his 1901 book, Regeneration, wherein he surveyed a wide variety of experiments on regenerative phenomena. This talk will explore some of the work Morgan presented in Regeneration (1901) to challenge existing theories of development. Morgan examined the experimental work of others to set the context for his own studies of regeneration. The evidence from his experimental work challenged existing theories of development and regeneration and provided the foundation for his “tensions theory” of regeneration, which he presented as a working hypothesis to inspire further work. Although the immediate and long-lasting impact of Regeneration on the scientific community has been questioned (Wallace, 1985), I will show that it contains three contributions of great significance: 1) a detailed taxonomy of regenerative processes; 2) the notion that regeneration models development; 3) the assertion that regeneration is a fundamental property of organisms.
“developmentalist” approach to the characteristic distributions within the model of morphospace. (ii) In ascribing functional properties to biological items, we appeal directly or indirectly to evolutionary considerations in their ‘backward-looking’ sense. Thus, according to the prevailing direct appeal, the function of a biological item is the effect produced by the item, which was selected by certain evolutionary pressures in the past. However, the ‘selected-effect’ account of functions, and in general those accounts that refer to this type of restrictions, incorrectly deploy evolutionary considerations. For instance, the evolutionary pressures of natural selection can only track the biological item that functions better in a given range of environmental circumstances than its item-competitors, but not its function (for this, and other similar objections to the ‘selected-effect’ accounts, see Cummins [2002]). Now, I will argue here that the proposed assessment of the phylogenetical model of morphospace provides a more adequate scientific resource for relating philosophical accounts of functional claims to evolutionary analyses.
REFERENCES
Alberch, P. ([1982]), “Developmental Constraints in Evolutionary Processes”; Amundson, R. ([1994]), “Two Concepts of Constraint: Adaptationism and the Challenge from Developmental Biology”; Cummins, R. ([2002]), “Neo- Teleology”; Griffiths, P. ([2002]), “Molecular and Developmental Biology”
Session: III.11 Room: Peter Chalk 2.1 Time: Thursday 16:30-18:00
Predrag Sustar
Session: IV.10 Room: Peter Chalk 2.3 Time: Friday 9:30-11:00
Department of Philosophy, University of Rijeka, Rijeka, Omladinska 14, Croatia
Bartlomiej Swiatczak
Functions In The Morphospace
I will examine (i) the phylogenetical model of morphospace, and (ii) its utility in the philosophical debate on functions in biology. (i) In phylogenetical analyses, the model of morphospace is usually considered as a suitable theoretical device for studying the interface of functional and evolutionary morphology. Thus, according to this model, the basic distribution is concerned with the relationship between observed organic forms and all possible forms, which has a characteristic value for a given morphotype (Alberch [1982]). Besides clarifying certain specific features of this kind of model building in biology, I will pay special attention to apparently conflicting strategies by which distributions of organic forms in the morphospace are explained (Amundson [1994]; Griffiths [2002]). With regard to the so-called “adaptationist” and “developmentalist” explanatory strategies, I will focus on different roles that the mechanism of natural selection plays in these explanations. In that respect, it will be particularly relevant to our purposes in (ii) to determine a certain stabilizing role of natural selection in the
SEMM, European School of Molecular Medicine, Milan, Lombardy, Italy
Natural Selection and the Problem of Reduction in Life Sciences
The main sources of knowledge about evolutionary events are chronological documents, the geographic distribution of populations and the results of comparative studies. Although these methods allow us to discover historical facts, they cannot be treated as fully reliable approaches to the evolutionary processes occurring at the present time. Moreover, these methods do not allow us to follow reliably the mechanisms underlying phylogeny. The investigation of contemporary evolutionary processes and their mechanisms requires methodical observation of the gradual changes within a living group of organisms. The results of this kind of investigation suggest that it is basically possible to distinguish two relatively independent biological levels, which are subject to evolutionary change: a molecular level and a physio-morphological level. There are reasons to believe that these two ontological domains are quite autonomous. The changes occurring at the two levels are generated by different causes and
may have different phylogenic consequences. If it turned out that there really are two different levels of organization that can influence each other and that are subject to different causal relationships, it would have very important philosophical and methodological consequences. There has been a discussion among philosophers about whether physiological and morphological studies are reducible to analyses in molecular terms. This discussion has been carried out not only within philosophy of biology but also within philosophy of mind and philosophy of science. For example, Kim has tried to show that believing in the causal relevance of supervenient properties conflicts with the thesis of the causal closure of the physical world. However, evolutionary biology provides us with examples proving the existence of different, relatively independent levels of reality, and it allows us to understand the nature of the relationship between these levels without necessarily rejecting the theory of causal closure. I will argue that examples of evolutionary events on micro and macro levels can be regarded as empirical proof that it is basically possible to distinguish at least two irreducible levels of biological organization which are not only involved in quite different causal relationships but can also influence each other.
Room: Peter Chalk 1.5 Time: Thursday 16:30-18:00
years. However, there are some further results from molecular biology which puzzle morphologists, for example the finding that enteropneusts and not pterobranchs are basal hemichordates, or that the Tunicata and not the Acrania are the sister group of the Craniota. These results are in conflict with the majority of traditional, morphology-based models and especially affect our understanding of deuterostome anagenesis. The discussion of the anagenetic aspect has to be completed by considering remarkable findings from evo-devo approaches, especially the hypothesis of a polysegmented ur-deuterostome. Gee (2001) was one of the first authors to characterize the new view of deuterostome phylogeny by assuming a segmented precursor with gill slits, a hypothesis which is still under discussion. The aim of the talk is to show that not only the cladogenetic Chordata+Ambulacraria distinction, but also the view of a polysegmented ur-deuterostome, can be traced back to lesser-known historical precursors, which anticipated nearly all characteristic details of the “new” molecular findings (for example the derived position of the Pterobranchia).
Tareq Syed
Morphisto Evolutionsforschung und Anwendung GmbH, Frankfurt am Main, Germany
Anagenesis and cladogenesis in deuterostome evolution: Well-known molecular phylogenies and well-forgotten morphological models
According to the so-called “new animal phylogeny” (Adoutte et al. 2000), the Bilateria consist of three superphyla: Deuterostomia, Ecdysozoa and Lophotrochozoa. Of these, only the Deuterostomia had already been proposed by traditional morphology, although it remained controversial which phyla were to be included in this taxon. For example, the Tentaculata (syn. Lophophorata) were usually associated with the Deuterostomia, until results from molecular systematics rejected this assumption. According to molecular systematics, the Deuterostomia include six main groups (Craniota, Acrania, Tunicata, Hemichordata, Echinodermata, Xenoturbellida), which have to be separated into two taxa (Chordata and Ambulacraria). The cladogenetic Chordata+Ambulacraria distinction has historical precursors in traditional morphology; it can be traced back to Metschnikoff (1881). Thus, the cladogenetic result of the new deuterostome phylogeny appears to be unproblematic for morphologists and has become a well-known textbook scheme over the past few
When is genetic analysis useful and sustainable? Perspectives on some new and old debates about genes and environment
This talk stresses continuities, not discontinuities, in the era of genomics. Despite the rapid expansion of genetic information, longstanding problems persist, problems that date from the origins of classical genetics in the nexus of agricultural improvement and eugenic aspirations. I identify key problems in three areas. These can serve as entry points into the social construction of the lack of knowledge and inquiry in areas that, I argue, would be needed to make genetic analysis useful and sustainable.
a. High human heritability estimates have given warrant to searching for specific genes contributing to behavioral traits. I note that no genes or measurable genetic factors (such as alleles, tandem repeats, or chromosomal inversions) are examined in deriving heritability estimates, nor do the methods of analysis suggest where to look for them (see the sketch after point c below). Indeed, even if the similarity among a set of close relatives is associated with similarity of yet-to-be-identified genetic factors, the factors may not be the same from one set of relatives to the next, or from one environment to the next. The discussion here also notes continuities with the close relationship between agricultural, evolutionary
genetics, and eugenics in the early 20C and with the
Session: IV.1 Room: Newman B Time: Friday 9:30-11:00
Peter Taylor
UMass Boston, Boston, MA, United States
Session: III.6
analysis of Lewontin (1974) in which he concluded that the parameters of most interesting models of evolutionary genetics were unmeasurable.
b. To suggest that recent statistical analyses showing environmental modulation of genes affecting human behavior are not very useful, I draw attention to their error bars, which mean that any pre-symptomatic genetic diagnosis would result in costly misallocation into vulnerable versus safe categories. The discussion here contributes to questioning whether genome-typing can be useful for personalized therapy or for public policy (in a C21 version of eugenics or social hygiene).
c. Although data collection for genetic type is cheaper than for environmental exposures and the data are easier to store, longitudinal epidemiological analyses need both. So I ask how longitudinal environmental data can be sustained in this era of genomics and of troubled state support for public health. The discussion here builds, among other things, on the concern that the tools (techniques, organisms) have, since the early C20, shaped what counts as studying heredity as much as the original questions have, and continue to do so.
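By way of illustration of point (a) above (a standard textbook estimator, not necessarily the analyses Taylor has in mind): in the classical twin design, heritability is estimated purely from phenotypic correlations among relatives,

    h^2 \approx 2\,(r_{MZ} - r_{DZ}),

where r_{MZ} and r_{DZ} are the phenotypic correlations of monozygotic and dizygotic twin pairs. No allele, tandem repeat or other measurable genetic factor appears on the right-hand side, which is the sense in which such estimates cannot, by themselves, indicate where to look for genes.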
to a modern age of veterinary medicine, I want to show the co-existence of different ways of understanding and treating animal diseases in the late XVIIIth century in France. Thereby, my objective is to provide evidence for John Pickstone’s concepts of ‘ways of knowing’ and ‘ways of making’ and revisit Michel Foucault’s notion of ‘epistemic break’ as illustrated in The Birth of the Clinic.
Second, I want to examine the ways in which the expertise on rinderpest was constructed. My aim is to show that not only physicians but also politicians and clergymen (especially country priests, who played a seminal role in popularizing knowledge among peasants) were committed to Vicq d’Azyr’s approach. I then argue that Vicq d’Azyr’s expertise matched not only a more scientific outlook within veterinary knowledge, but also the physiocratic ideas which dominated the political agenda of that period. At the same time, I show how Vicq d’Azyr’s approach was also an attempt to downplay the tacit knowledge of blacksmiths, to eradicate charlatanism, and to control and discipline the allegedly ‘ignorant’, ‘superstitious’ and ‘dirty’ peasants.
Finally, I show that facing the issue of eradicating a disease ‘without a cause’, and eventually adopting the method of slaughtering infected animals, Vicq d’Azyr paradoxically resorted more to a religious discourse (of purification and sacrifice) than to a scientific one.
Session: II.6 Room: Peter Chalk 1.5 Time: Thursday 14:30-16:00
Marion Thomas
REHSEIS (CNRS-Univ. PARIS7), Paris, France
Science, Religion and Politics:
the Construction of Expertise on Cattle Plague in Pre-revolutionary France
Many accounts of French veterinary medicine acknowledge a transformation of veterinary medicine into a scientific profession in the late XVIIIth century, and emphasize the concern with the eradication of animal diseases as a pivotal factor in this transformation. For instance, the campaign waged by the French physician and anatomist Félix Vicq d’Azyr (1748-1794) against the rinderpest outbreak in south-western France in 1774-6 contributed not only to the flourishing of the French Royal veterinary schools, but also, in 1778, to the founding of the Parisian Royal Society of Medicine.
My paper undertakes a critical reappraisal of such claims and investigates further the role of Vicq d’Azyr in the transformation of French veterinary medicine. My first aim is to show that, while Vicq d’Azyr promoted himself as a pioneering figure of modern medicine, by insisting on a pathological-anatomical approach and arguing for a new medical language, he nonetheless relied on different discourses on diseases, which were rooted in Hippocratic medicine as well as in a ‘natural history’ approach to diseases (nosology). Then, rather than arguing for a sharp break from an ancient
Session: VIII.8 Room: Peter Chalk 2.6 Time: Saturday 9:00-10:30
Thomas Potthast
Interdepartmental Centre for Ethics, Tuebingen University, Tuebingen, Germany
Epistemic-moral hybrids:
discussing ethical normativity in the context of environmental interdisciplinarities.
A case study of the federal nature protection agencies in Germany 1906-2006
The history of nature conservation to a large extent appears to be a history of the tensions and interplays between nature as constructed by science and nature as a source of values to be understood in terms of aesthetics and politics. In the paper, I shall provide a case study of governmental nature protection in Germany from 1906 to 2006, showing the productive dialectics as well as the aporias of ecological science as the foundation of state interventions in dealing with nature.
The (Prussian, later: German) State Agency for the Conservation of Natural Monuments (later: Nature Conservation) was directed by biologists from its
foundation in 1906 until the 1960s. By focussing on the scientific inventory of biological and geological goods to be protected, they set the tone for a strategy complementary to the Heimatschutz movement, which initially was more than sceptical about science, since the latter was identified as the ‘mechanistic’ enemy of aesthetic and political perspectives on nature. It was the state institution, together with its local co-workers as well as universities, which pushed a specific vegetation-science approach, “plant sociology” (Pflanzensoziologie), to the forefront not only within conservation but also in academic plant ecology in continental Europe. It should be noted that the biologists’ agenda was itself often part of cultural agendas for creating a ‘bridging science’ between natural science, anthropology and the humanities. This integration changed its shape and approach from “Heimatkunde” to “Lebenskunde”, from “Landespflege” to “Landschaftsökologie” based on systems theories, and most recently to “Biodiversity”.
In general, scientific theory and practices have generated epistemic-moral hybrids by addressing at the same time natural-scientific evidence and moral imperatives with regard to analysing (destructive) change in the environment and providing expertise on treating nature properly.
The question remains whether these epistemic-moral hybrids i) are specific to ecology (as a domain of the biological sciences) only in the context of conservation and environmental issues, ii) show a general feature of ecology, or iii) can be extended to an even more general pattern of interdisciplinary fields of biology in the 20th century and beyond.
common sense insight: since high-level language and low-level language are just two ways of talking about the same thing, it is nonsensical to say that low-level properties are “more” causally efficacious than high-level ones. I weigh this common sense insight against the conflicting epiphenomenal intuition, and argue that the latter is due to a conflation of mereological and level-of-description distinctions. I close with the suggestion that this leaves the question of reduction vs. emergence open, since most supervenience arguments (e.g. Kim, 2005) appear to rest on just such a conflation.
Session: XII.9 Room: Peter Chalk 2.5 Time: Sunday 9:00-10:30
Jukka Tienari
Session: I.7 Room: Peter Chalk 1.6 Time: Thursday 11:30-13:00
Helsinki University Central Hospital and HUSLAB, Helsinki, Finland
An Algebraic Model For Teaching Theoretical Biology
A reduction of biological knowledge to physics and chemistry would require that it be given a mathematical form compatible with physicochemical laws. Conventional mathematical models predicting time courses in molecular biology are based on ordinary differential equations. In practice, construction of such models is often difficult, which is paradoxical given the massive amount of experimental data on biological systems that has accumulated. The available data often remain insufficient to run useful simulations because, in many respects, biology is extremely data-poor – particularly in terms of quantitative, time-dependent and spatially resolved measurements. So few details about the actual in vivo processes are known that it is very difficult to proceed without numerous, and often arbitrary, assumptions about the nature of the nonlinearities and the values of the parameters governing the reactions. Therefore, it is presently impossible to construct reliable predictive mathematical models for many biological processes. The lack of reductive models creates challenges for teaching the physicochemical foundations of living processes. In the absence of general mathematical representations, the physicochemical principles underlying biological processes have sometimes been given verbal formulations in textbooks. Often the problem has been circumvented by adding a paragraph introducing a few basic thermodynamic principles. Such accounts remain rather distant from the actual application of physical chemistry to biology. There are, however, alternative ways of explaining the thermodynamics of living processes. We could introduce the thermodynamic principles relevant to biology by a deductive sequence explaining them as necessary outcomes of familiar laws such as mass action and basic kinetic principles. Instead of an
Samuel Thomsen
University of Pittsburgh, Pittsburgh, PA, United States
How Emergence Might Overcome Epiphenomenalism
Contemporary epiphenomenalism tends to rest on certain intuitions about causal properties, namely, that any causal properties had by “high-level” features like consciousness are really nothing but the causal properties of the “low-level” parts on which they supervene, so that the real causal story reduces from a complex one to simpler ones belonging to physics. A common objection to emergentism and other forms of non-reductive physicalism rests on essentially the same point: that it does not appear that high-level properties can have any causal powers not already had by their low-level instantiations. In this paper I develop a viewpoint suggested by several philosophers, from Mark Bedau to Richard Rorty, which rests on an opposed
attempt to reduce biology to universal physicochemical laws, this approach would be based on a comparison of biological systems to idealized physicochemical systems arrested in steady states, which are chosen in order to keep the mathematical analysis as simple as possible. Such discrete states allow us to demonstrate the logical consistency of basic biological and physicochemical principles by a deduction based on algebraic inequalities. Algebraic manipulations are often simple, so that almost all the intermediate logical steps can be supplied, which makes arguments constructed from them convincing as formal proofs. Such a presentation could serve as a standard introduction to theoretical biology in textbooks and in undergraduate biology courses.
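By way of illustration only (a minimal sketch of the kind of algebraic deduction envisaged, using a hypothetical reaction rather than Tienari's own examples): for a reversible reaction A \rightleftharpoons B with mass-action rate constants k_{+} and k_{-}, the net flux is

    J = k_{+}[A] - k_{-}[B],

and the condition that a steady state sustain net conversion of A into B reduces to the algebraic inequality

    J > 0 \iff \frac{[B]}{[A]} < \frac{k_{+}}{k_{-}} = K_{eq}.

Since an isolated system relaxes to [B]/[A] = K_{eq}, keeping this inequality satisfied indefinitely requires a continual supply of A and removal of B from outside the system; a basic thermodynamic feature of living processes thus follows from mass action and elementary algebra alone.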
Time: Sunday 11:00-12:30
Session: III.11 Room: Peter Chalk 2.1 Time: Thursday 16:30-18:00
Session: XIII.11
Room: Peter Chalk 2.1
Systems of Functions: Functional Attribution and Functional Decomposition in Biology
Since Aristotle (HA 589a2-5) a dualism of fundamental functions has been well established in biology, with nutrition and reproduction being the uppermost functions of organisms. In scholasticism this dualism was terminologically standardized by the notions of self-conservation (“conservatio sui”) and species conservation (“conservatio speciei”). Long before the establishment of biology as a distinct science, these two concepts were considered principles determining the systematic scope of analyses and explanations in the organic domain. Curiously, each concept is sometimes translated into the other: organic self-maintenance is characterized as a kind of reproduction (“self-reproduction” of the body by constantly changing its parts), and vice versa reproduction is interpreted as the most efficient mode of self-maintenance. But on a different analysis, the functional perspectives guided by the two principles are in conflict with each other. From an evolutionary point of view, self-preservation is subordinated to species-preservation or reproduction, because what is maximized in evolution is not individual life-span but the overall representation of a type in the future (by reproduction).
In my paper, I will discuss the status and relationship of these two fundamental principles and ask whether they can still be taken as delimiting the analytical and explanatory scope of biology. In their basic rank they could qualify as methodological principles, excluding from biology those aspects of organic systems that cannot be subordinated to them (e.g. beauty for beauty’s sake or true moral altruism).
Additionally, I will discuss several accounts of functional systems in biology, e.g. 19th-century approaches to the analysis of animal behaviour that distinguish four universal functions, J. von Uexküll’s schema of function-circles, and J.G. Miller’s decomposition of the organism into 20 “critical subsystems”.
By this approach, I will address the problem of functional language in biology not in the standard way by giving general and formal explications of the notion ‘function’, but by analyzing the system of functions that were actually attributed to organisms. Instead of explaining or translating singular instances of function statements, I will focus on the ensemble of functions that constitute an organism. Some leading questions are:
Stéphane Tirard
Université de Nantes, Nantes, France
Spontaneous generations, beginning of life and history of life in Lamarck’s theory
In 1802, in his book, Recherches sur l’organisation des corps vivans... Lamarck presented his evolutionary theory and, for the first time, introduced the notion of spontaneous generations in his work about living organisms.
We claim that, in Lamarck’s evolutionary theory, spontaneous generations are not a zoological anecdote but a crucial argument. Spontaneous generations are the junction between non-living matter and the term of animality, Monas termo. This process of transformation of matter from non-living to living is completely included in the general process of evolution and, for Lamarck, spontaneous generations are a simple, but complete, model of the mechanisms of the transformation of organisms.
Moreover, it is also very important to emphasize that Lamarck establishes an analogy between spontaneous generations and fecundation as a proof of the existence of spontaneous generations. In this way we can observe how spontaneous generations can be connected with an epigenetic conception. Finally, we have to distinguish spontaneous generation as a perpetual beginning from spontaneous generation as a primordial beginning. In Lamarck’s work, the first sense dominates the second, but by studying the possibility of the second we can reveal the limits of the history of life in Lamarck’s evolutionary theory.
Georg Toepfer
Humboldt-University, Berlin, Germany
Can organic functions be ordered in a system, whether modular, hierarchical, or of another type? By which theories are the attributions of basic functions guided? Has the basic level of functional attribution changed in history?
in the associated text.
Therefore, given the pervasive misconceptions about some of the most important evolutionary concepts and the importance of the phylogenetic tree as an aid to scientific literacy, it is fundamental to carry out an analysis of its quality in natural history museums. Our goal is to develop a suitable methodology to do so and to determine whether the tree of life metaphor is really necessary in this context.
Session: VI.7 Room: Peter Chalk 1.6 Time: Friday 14:30-16:00
Erica Torrens, Ana Barahona
National University of Mexico, Mexico DF, Mexico
Is the tree of life metaphor really necessary?
Although biological evolution is a fact, not a theory, and its study is compulsory in secondary education in México, there are still pervasive misconceptions about some of the most important concepts related to it.
For more than two decades it has been known that many students, and the public in general, have difficulty understanding the basic mechanisms responsible for evolutionary change. Some of the most significant difficulties found when teaching and learning about biological evolution are related to the simplification of concepts, causal thought, anthropocentric thought, and the influence of the mass media (TV, internet) and language.
On the other hand, Darwin’s great book has two main ideas in it, which are represented by the title’s two clauses: the origin of species and natural selection. The first is the argument that all living beings originated from a single common ancestor despite their diversity of forms, habitats and general features. To Darwin, this idea could be represented by a tree, in the same way that separate twigs on a tree trace back to the same major branches. So Darwin developed the metaphor of the “tree of life” to communicate this idea and to show evolutionary relationships between species, orders, classes, etc.
Since then, evolutionary trees have become a major didactic tool in textbooks, museums, magazines, etc., as they are the most direct representation of the principle of common ancestry. However, tree diagrams are often misinterpreted for a number of reasons, mainly because it is not easy to read a tree diagram as a depiction of evolutionary relationships rather than of similarities, and because non-specialists are prone to read trees along the tips, from bottom to top, thereby assuming a direction in the evolutionary process. This leads to the incorrect view that evolution is a linear progression from “less evolved” to “more evolved” species.
The correct way to read a phylogenetic tree involves training because it requires the reader to know the conventions employed in the diagram to understand the information within. These conventions are ideally found
Session: XI.4 Room: Peter Chalk 1.1 Time: Saturday 16:00-17:30
Michael Trestman
University of California, Davis, CA, United States
The Informational Bee: the Integrative Role of a Causal Concept of Information
The term ‘information’ is widely used in many areas of biological theory, from molecular genetics and developmental biology to behavioral ecology and neuroscience. I advocate a sharp distinction between its use in biological theory and its epistemic or semantic meanings in other contexts. By analyzing what it means, in purely causal terms, for a biological system to ‘react to’ or ‘use’ information, and treating the contentious and context-sensitive issue of function as conceptually separate, we may understand the theoretical locutions concerning ‘information’ used in many areas of biology in a way that makes them non-metaphorical, non-trivial, and metaphysically respectable. Moreover, an ‘informational’ perspective can serve an important integrative function between various research programs. I illustrate this point with an overview of the diverse and multifaceted literature on explicitly and implicitly information-theoretic approaches to studying colony-level behavioral organization in the honeybee. I argue that my analysis captures the theoretically important concept of information at work here, and avoids problems that plague other accounts, such as the teleosemantic, which treats some notion of proper evolutionary function as intrinsic to the concept of information. Jablonka’s (2002) account is an attempt to fix some of the problems with earlier teleosemantic accounts, especially Maynard Smith (2000). While her account is an improvement in several respects, I argue that it fails to overcome, and indeed highlights, problems endemic to the teleosemantic approach generally, which render it inapplicable to the concept of information as it is widely used in biological research.
Session: XII.11 Room: Peter Chalk 2.1 Time: Sunday 9:00-10:30
Second, we will utilize this exposition in order to examine the nature and structure of these models, with scrupulous attention paid to: (1) the methodological motivations accompanying the acceptance of a mathematical model into a biological hypothesis; (2) the interpretation of the model both within and outside of the theoretical construct in which it is embedded; and, most importantly, (3) the consequences of the various interpretations assigned to these models. Both Bayesian and Agent-Based Models attempt to create broad descriptive and predictive frameworks. The authors will discuss the relationship between prediction and explanation in these models with reference to the prioritization of predictive components over explanatory components. Further, this section will discuss whether Bayesian and Agent-Based Models, interpreted instrumentally, represent adequate explanatory tools for use in the description of complex adaptive systems.
The conclusion drawn by the authors is, essentially, a negative one, but with the admission that it is far too early to provide a definitive rejection of the use of these models as an explanation of the behaviour of complex organisms. This is not to say that Agent-Based and Bayesian models cannot provide accurate predictions of behavior, but simply that prediction is not explanation.
Mark D. Tschaepe
Southern Illinois University, Carbondale, United States
Gospel of Greed: Peirce’s misreading of Darwin
In Peirce’s essay, “Evolutionary Love,” Darwin’s evolutionary theory provides the basis for what Peirce calls tychism. From considerations of tychism and anancism, Peirce formulates his theory of cosmological evolution: agapasm. Given the importance Peirce accords to Darwin’s theory, analyzing Peirce’s reading of Darwin facilitates an understanding of Peirce’s evolutionary cosmology, as well as future misreadings of Darwin by American philosophers.
In this presentation, I argue that Peirce misreads and thus only offers a caricature of Darwinian evolution, and that this misreading leads to serious questions concerning agapasm. First, I define tychism and show how it provides a basis for agapasm. Second, I explain how Peirce misreads Darwin in his analysis of evolution. Third, I indicate what problems arise for Peirce’s theory of evolutionary love once this caricature of Darwin is recognized. Finally, I explain the implications for Peirce’s evolutionary theory when his reading of Darwin is corrected.
Session: XII.9 Room: Peter Chalk 2.5 Time: Sunday 9:00-10:30
Session: VII.8 Room: Peter Chalk 2.6 Time: Friday 16:30-18:00
Trin Turner2, Tom Schenk, Jr.1
Derek Turner
1) Iowa State University, Ames, Iowa, United States, 2) Western Michigan University, Kalamazoo, Michigan, United States
Connecticut College, New London, Connecticut, United States
Bayes is the New Black: Agent-Based Modeling and Bayesian Inferences in Biology
The recent use of Bayesian inference and Agent-Based Computational Models in the biological sciences to predict the foraging and mate-selection behavior of complex entities brings with it a host of methodological and philosophical questions, such as the interpretation of these predictive models and their adequacy as explanatory models. The purpose of this paper is twofold. First, we wish to provide a brief exposition of the most current usage of mathematical modeling in the biological sciences, with particular attention paid to models based on machine learning capabilities and the modeling of the learning processes that may or may not be present in complex organisms. Specifically, we will be looking at the utilization of Bayesian Updating Modeling Frameworks and Agent-Based Computational Modeling for use in the prediction of (i) foraging behavior and (ii) mate selection procedures within a multitude of divergent environments.
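By way of illustration only (a minimal sketch of Bayesian updating applied to a foraging decision, not the specific frameworks the authors examine; the prior, the give-up threshold and the encounter data are hypothetical):

    def update(alpha, beta, found):
        # Conjugate Beta-Bernoulli update of the belief about patch quality.
        return alpha + found, beta + (1 - found)

    def forage(encounters, threshold=0.3):
        # Start from a uniform Beta(1, 1) prior and leave the patch as soon as
        # the posterior mean falls below the give-up threshold.
        alpha, beta = 1.0, 1.0
        for step, found in enumerate(encounters, start=1):
            alpha, beta = update(alpha, beta, found)
            if alpha / (alpha + beta) < threshold:
                return step, alpha / (alpha + beta)
        return len(encounters), alpha / (alpha + beta)

    poor_patch = [0, 0, 1, 0, 0, 0]   # mostly failed encounters (hypothetical data)
    rich_patch = [1, 1, 0, 1, 1, 1]
    print(forage(poor_patch))          # leaves early with a low belief
    print(forage(rich_patch))          # stays, belief ends high

An agent-based model would then embed many such updating foragers in a shared, depleting environment and study the population-level foraging patterns that emerge from their interactions.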
The Trendiness of Paleobiology
In a series of recent papers, Andre Ariew and colleagues have elaborated and defended a statistical conception of evolutionary theory (Walsh, Lewens, and Ariew 2001; Matthen and Ariew 2002; Ariew and Lewontin 2004). According to this statistical view, which is best contrasted with the dynamical conception defended by Sober (1984), “natural selection is not a process driven by various evolutionary factors taken as forces; rather, it is a statistical “trend”...” (Ariew and Matthen 2004, p. 57). In this paper, I argue that the statistical view has some interesting consequences for our understanding of the relationship between macro- and microevolution, and for our understanding of paleobiology’s uneasy relationship with the rest of evolutionary biology.
Paleobiologists have long been interested in testing claims about macroevolutionary trends, such as body size increase (Cope’s rule). Much of the empirical work has focused on the question whether a given trend is passive or driven (McShea 1994). Empirical tests have
been devised to determine whether a trend is passive or driven. Once a trend has been identified as passive vs. driven, researchers might also ask what causes it.
Many scientists have thought that if Cope’s rule were to turn out to be a driven trend, it would be natural to suppose that the underlying cause of the directional bias is selection in favor of larger body size. It is not unusual to see speculations about why selection might favor bigger organisms: they are tougher to kill, more attractive to potential mates, more intimidating to rivals, better able to regulate their body temperatures, and so on. However, if the statistical view is correct, natural selection is not the sort of thing that can be a cause. That is because on the statistical view, natural selection is a trend, and statistical trends cannot be causes. I argue that trends cannot be causes because they are abstract entities. (Consider: the geographical center of population of the U.S. is an abstraction. A westward drift in the center of population is a statistical trend, but it is not the sort of thing that can be a cause.)
Thus, if the statistical view of natural selection is correct, we need to rethink the idea that selection can be the cause of large-scale evolutionary trends. I conclude the paper with some positive suggestions about how to think about the relationship between micro- and macroevolution, provided that the statistical view is correct.
Time: Saturday 11:00-12:30
University of the Basque Country, Donostia, Basque Country
Waddington’s Symposia.
A Retrospective Assessment
As is well known, in the late 1960s Waddington organized, in the name of the International Union of Biological Sciences (IUBS), a series of meetings which ended up being four consecutive symposia with Towards a Theoretical Biology as their general subject. The contributions were later published, simultaneously, by Edinburgh University Press in the United Kingdom and by Aldine in the United States.
As the title makes clear, the goal was to bring together a broad group of researchers and have them reflect upon the prospects of an area of research that could build up Theoretical Biology in its more general and fundamental sense.
In this work, on the one hand, I will present the relevant information with respect to the original idea,
the organizer, the participants, the development of the meetings and their conclusions, as well as highlights of some of the works presented.
On the other hand, my aim will be to examine whether this effort might be considered, in any sense, a forerunner of current systemic and complexity-oriented approaches in the biological sciences (and the philosophy of biology).
In order to do the latter, I will first make a brief assessment of their impact and repercussions during those very years (the late 60s and 70s) and attempt an explanation of why they were mainly disregarded. Then I will explore to what extent those symposia and/or their participants might have had any influence on the current situation in biology, only to find that they probably did not, at least not as a direct, even if delayed, effect of the meetings.
My contention will be that neither the scientific nor the philosophical atmosphere of those decades was conducive enough to push forward the perspectives advocated by Waddington and most of his contributors, but that the present is more so. To support this I will draw on the new modeling and computing possibilities, the interest in systemic approaches, the centrality of complexity and complex-systems studies, and more specific endeavors such as, for instance, post-HGP proteomics or Evo-Devo.
Accordingly, even if we admit their lack of direct influence, I will claim that, nevertheless, at least some of the persons, their ideas, modeling tools and approaches are precisely those that are contributing to the changes that are nowadays taking place in biology (and other areas).
Session: IX.7 Room: Peter Chalk 1.6
Jon Umerez
Session: IX.6 Room: Peter Chalk 1.5 Time: Saturday 11:00-12:30
Alexander v. Schwerin
Technical University of Braunschweig, Braunschweig, Germany
Isotopes and Animal Models during National Socialism
Scholarship exists that describes – albeit primarily for the United States – how the tracer technique made its appearance in the life sciences, especially biochemistry, physiology and medicine. In addition, historians have one-sidedly investigated the circulation of radioisotopes in the atomic age – that is, when radioisotopes started to be produced in huge quantities by the first atomic piles, primarily in the United States and, soon afterwards, the United Kingdom.
That having been said, there has been relatively little work concerned with radioisotopes in Germany under
National Socialism. This may be due to the fact that the pioneers of the tracer techniques in biochemistry left Germany, primarily for the United States, for political reasons. Nonetheless, Bernd Gausemeier and Florian Schmaltz have only recently described significant activities using radioisotopes for biological work during National Socialism. They pointed to the combination of industrial, scientific and military interests that turned out to be crucial for the “activation” of radioisotopes in living matter. Last but not least, in addition to biological institutes under the umbrella of the Kaiser Wilhelm Society, the German Research Foundation (DFG) played an active role within this policy of objects.
Session: I.9 Room: Peter Chalk 2.5 Time: Thursday 11:30-13:00
In my presentation I would like to widen the scope of the existing historiography by first giving a short overview of further activities connected to the use of radioisotopes as biological tracers in Germany under the swastika. My main issue will be to ask what significance regulatory policy and radiation detection had for the realisation of new biotechnical practices. Finally, I will draw a comparison with another biological tool, animal models, with the aim of addressing the special conditions of research under National Socialism.
Transactional Analogues: Non-semantic Representation in the Mind and Elsewhere
Few would insist that a walking stick insect avoids predators because those predators think it is a sign of twiggy things. Predators ignore what seems not to be prey, and such identification is a matter of perception. When a biological analogue (e.g., that walking stick) corresponds closely with other things (e.g., twiggy ones), it is tempting to say a predator “thinks” the analogue is a non-prey twiggy thing out there. Instead, the predator thinks there is a non-prey twiggy thing out there, and has no inkling of the walking stick per se.
The walking stick is a transactional rather than semantic analogue. We might say it functions as a transactional metaphor for the external correlate, given that a predator acts as though it were itself a part of the twiggy background. It’s a structural analogue for twigs, and a functional analog, since the responses of predators to its structure select for it evolutionarily.
Mental representations are the same kind of thing. Centuries of treating them as semantic representations – as signs of externalities – have left philosophers at a perplexing impasse. We can’t seem to get from the biological function of representations to the semantic function we have always presumed they are all about. The problem: we haven’t recognized them for what they are.
Mentally representing organisms typically treat their internal world of experience (a transactional analogue of parts of the external world) as being the only world, which is tantamount (for practical purposes) to treating it as the external world itself (making it effectively a metaphor of that world). But mental representation theorists have traditionally treated representations as signs. For an agent to use a sign, it needs knowledge of the sign itself, of what it represents, and of the grounding relation allowing it to represent its object.
Virtual reality is a helpful example of transactional representation. The agent/user responds strictly to presented experience, and her actions are tracked within the virtual realm, not that of bodily motion. The vivid impression of total immersion in the virtual realm (and its “transparency”) is illuminating – especially given that users uniformly know that the virtual realm is not real. What of an organism experiencing an extremely vivid, internally generated virtual reality, with no knowledge that there is any other domain whatsoever?
I’ll point to evidence from neuroscience (homework for
Kent Van Cleave
Indiana University, Bloomington, IN, United States
Session: XI.10 Room: Peter Chalk 2.3 Time: Saturday 16:00-17:30
Minus Van Baalen
UMR7625, Université Paris 6, Paris, France
How new units of selection may emerge in the course of evolution
Any biological organism from viruses up is composed of smaller subunits. How such organisation could arise in the course of evolution is often not clear, as such cooperative structures are typically vulnerable to ‘cheating’: subunits diverting resources to their own ends at the expense of the whole. This problem has been recognised as one of the main problems in evolutionary biology since Hamilton’s seminal work showed that selection will favour individuals that pursue their own interests rather than those of the species. I will discuss recent developments in our understanding of, in particular, the ecological factors that favour individual subunits forgoing (part of) their private interests to benefit a common good.
New mathematical techniques allow us to generate predictions as to when independently replicating units may integrate to form new units of selection. This way we obtain insight into the conditions that have favoured the major evolutionary transitions.
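By way of illustration only (a classic textbook condition, not necessarily one of the new techniques van Baalen will discuss), Hamilton's rule gives the simplest threshold at which a subunit is favoured to sacrifice part of its private interest for a common good:

    r\,b - c > 0,

where c is the fitness cost to the subunit, b the benefit conferred on others, and r their relatedness (or, more generally, a measure of assortment among cooperators). Conditions for the integration of replicating units into new units of selection can be read as ecological and structural circumstances that keep such inequalities satisfied.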
the audience) that the brain is a modeller, not an interpreter, of external signs. With thoroughly compelling “virtually real” models of a “perceived”, imagined, or remembered world, organisms have no need for semantic access to the external world.
that have to do with empirical characteristics of human attention. Partly, those characteristics are well known from journalism. I will discuss some ideas directed at the problem of how non-spectacular global issues can be made more “morally newsworthy” and get more prominence on the genomics moral agenda.
Session: VII.6 Room: Peter Chalk 1.5 Time: Friday 16:30-18:00
Cor van der Weele
Session: IV.3 Room: Newman D Time: Friday 9:30-11:00
Wageningen University, Wageningen, Netherlands
Joris Van Poucke, Philippe De Backer, Gertrudis Van de Vijver, Marcelle Holsters, Dani De Waele, Linda Van Speybroeck
Justifying the moral agenda on genomics
Which moral issues receive more than enough attention, and which deserve more attention than they get? These are questions worth raising, in general but also concerning genomics, so it is to be welcomed that moral agendas on genomics are indeed increasingly reflected upon. This happens through inventories of ELSI research projects (in the US; ELSA in Europe, GE3LS in Canada) and interviews with experts.
Such inventories and interviews reveal trends in the evolution of issues, but they do not necessarily reveal an underlying pattern in those trends. In my paper, I will call attention to a pattern by distinguishing three frames on relations between science and society that guide moral agendas (as well as the way issues on the agenda should be approached). The three frames are, respectively, ELSI (or ELSA/GE3LS), participatory democracy and global ethics. In an ELSI framework, scientific developments are the starting point. Ethics is involved with the social aspects and implications emerging from developments in science and technology. In a participatory framework, science and society are seen as “co-evolving”; ethics should see to it that this co-evolution takes good, that is democratic, forms. In a global ethics perspective, the moral agenda is set not on the basis of science but of the moral state of the world. The moral challenge is to harness science so that it contributes to the solution of global injustice.
These frames have direct implications for the relevance of moral issues concerning genomics. For example, in an ELSI framework, transhumanism is a good and important issue for moral reflection, because scientific developments may enable dramatic human improvement in the future, and we should prepare for that future by starting to think about potential implications now that there is time. From a global ethics perspective, on the other hand, it is an almost perverse issue to spend time and energy on, because it does not address existing global injustices.
It is not accidental that the moral agenda concerning genomics is dominated by the ELSI frame; ELSI is a more natural moral frame than global ethics for reasons that have to do with empirical characteristics of human attention. Partly, those characteristics are well known from journalism. I will discuss some ideas directed at the problem of how non-spectacular global issues can be made more “morally newsworthy” and get more prominence on the genomics moral agenda.
Ghent University, Ghent, Belgium
Anti-Reductionism And Modelling In Systems Biology: Different Perspectives
Today, Systems Biology is portrayed as a very important and potentially revolutionary field in the post-genomic era. This claim to potential is often made by reference to a so-called anti-reductionist move, which supposedly allows Systems Biology to differentiate itself from the ‘old’ molecular biology.
However, as neither philosophy of science nor Systems Biology presents a univocal meaning of the concept of reductionism (nor of corresponding concepts such as holism and emergence), further analysis is needed to explore to what extent Systems Biology is really innovative.
In this presentation, several views on (anti-)reductionism as present in the systems biology literature will be discussed. Based on the differences between these views, at least three interpretations of ‘Systems Biology’ can be characterized: (i) ‘Molecular Systems Biology’ uses a concept of anti-reductionism to draw an opposition between large-scale (‘-omics’) approaches of systems biology and pathway/component-centred biology, (ii) the ‘Engineer/design’ perspective links anti-reductionism to the level of research, i.e. it aims to focus on those levels governed by system and design principles, and (iii) in ‘Complex Systems Biology’, anti-reductionism is expressed by the adage ‘the whole is more than the sum of its parts’.
Next, the question arises to what extent these perspectives differ not just with regard to the kind of anti-reductionism adhered to, but also with regard to the modelling process and the kind of knowledge aimed at. An attempt is made to answer this question.
Session: VIII.7 Room: Peter Chalk 1.6 Time: Saturday 9:00-10:30
Session: XIII.4 Room: Peter Chalk 1.1 Time: Sunday 11:00-12:30
John van Wyhe
Charissa Varma
University of Cambridge, Cambridge, United Kingdom
University of Toronto, Toronto, Ontario, Canada
‘Darwin’s delay’: Another historiographical myth?
For many years it has been almost universally accepted that Charles Darwin postponed publishing his theory of evolution for many years. The dominant explanations have invoked fear as the primary factor. Yet it can be shown not only that the advent of ‘Darwin’s delay’ in the historiography is comparatively recent (mid twentieth century) but also that the primary historical data contain no evidence for it. This paper will demonstrate that one of the most widespread beliefs about Darwin is wrong and, finally, provide a solid, evidence-based explanation for why the Origin of Species was published as late as 1859.
Axiomatizing the Tree of Life:
The Impact of Logic on Biological Taxonomy in the Early Twentieth Century
This paper explores the complex relationship between logic and taxonomy in Britain and America from the 1930s to the 1950s, a period of upheaval that transformed both disciplines. For logicians, this period marks the beginning of the shift from a long-standing tradition of Aristotelian syllogistic logic that analysed propositions in terms of grammatical subjects and predicates, to the new algebraic logic that analysed propositions in terms of the relationship existing between two classes, to the early uses of the language of Principia Mathematica to axiomatize biology. For naturalists, this period marks the beginning of the shift from an understanding of the relationship between groups in terms of morphological affinities to an understanding of the relationship between groups in terms of phylogeny, an understanding ushered in by Charles Darwin’s theory of evolution by natural selection. Looking at the work of J. H. Woodger and its impact on J. R. Gregg, this paper will critically examine how two common assumptions about these revolutions (one that shaped the narrative of the standard history of logic and one that shaped the narrative of the standard history of taxonomy) were used to connect the history of logic to the history of taxonomy, and how the relationship between logic and taxonomy helped set the stage for, and fuel, the heated debate between evolutionary taxonomists and numerical taxonomists by the end of this period.
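To give a flavour of what ‘axiomatizing’ taxonomy meant in this tradition, the following is an illustrative reconstruction in the set-theoretic spirit of Woodger and Gregg, not a quotation of either author's actual system: taxa are treated as sets of organisms, and the hierarchy is captured by a nesting condition.

```latex
% Illustrative, Gregg-style reconstruction (not the original axioms):
% T is the collection of taxa, each taxon X being a set of organisms.
\forall X, Y \in T:\quad X \cap Y = \varnothing \;\lor\; X \subseteq Y \;\lor\; Y \subseteq X
% Any two taxa are either disjoint or nested, which is what makes the system
% a hierarchy; categories (genus, family, ...) can then be introduced as
% levels, i.e. sets of mutually disjoint taxa.
```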
Session: II.9 Room: Peter Chalk 2.5 Time: Thursday 14:30-16:00
Laura Vandenberg, Carlos Sonnenschein, Ana Soto
Tufts University, Boston, MA, United States
It’s Not In Your Genes But The Company You Keep: Phenotype, A View From The Bench
Phenotype is generally considered a trait or physical characteristic that is easy to observe. In contrast, assessing a genotype requires genetic sequencing, breeding experiments, or human pedigrees. However, many students of science are now taught to study phenotype as a consequence of genotype, thus equating a trait seen with the eye to an unseen gene.
My studies in the laboratory are exploring the effects of bisphenol-A, a xenoestrogen found in plastic consumer products, on the development of the mouse mammary gland. In an examination of the fetal mouse mammary gland, I found that the relative position of each fetus in the uterus with respect to its male and female neighbors affected the mammary gland phenotype, i.e. mammary glands from females positioned in the uterus between two males were more developed than mammary glands positioned between two females.
The results of my studies, and the examination of many others in the literature, have allowed me to ruminate on the concept of phenotype, not as a result of the unfolding of a genetic program, but resulting from the interaction of multiple resources, including environmental factors that have not previously been considered to be significant. As a student of science, these postulations have implications for how experiments are designed and how data should be examined and interpreted.
Session: III.3 Room: Newman D Time: Thursday 16:30-18:00
Theodore Varno
University of California, Berkeley, CA, United States
Naturalizing Selection: Ronald A. Fisher and the Rothamsted Experimental Station, 1919-1933
To both those who love it and those who hate it, neo-Darwinism has meant the synthesis of Darwinian selection and Mendelian inheritance. Consequently, historians have largely approached neo-Darwinism by examining the long-term relationship between ideas about heredity and models of evolution, seeing the theoretical work of population geneticists during the 1920s and 1930s as particularly important for reconciling these two separate areas of biological thought. This approach, however, overlooks a crucial point: Mendelism reemerged in the first decades of the twentieth century in an agricultural and eugenic context that placed a premium on understanding how selective breeding might be used to alter the hereditary character of populations. Artificial selection was always inherently a component of Mendelian experimentation. What was novel, then, about theoretical population genetics was not the way in which it merged Mendelism and selection, but rather how it transformed a program devoted to artificial selection into a defense of natural selection; from this historical perspective, neo-Darwinism naturalized a body of knowledge devoted to the manipulation of the genetic material of domestic organisms.
How and why did this transformation take place? In order to explore this question, this paper examines Ronald A. Fisher, an architect of neo-Darwinism, during the years he spent at the Rothamsted Experimental Station, an institution devoted to scientific agriculture. From 1919 to 1933, Fisher was simultaneously an agriculturalist and an evolutionary theorist; he analyzed crop variation and experimented with breeding populations of poultry and mice at the same time that he composed The Genetical Theory of Natural Selection. The paper will provide both a broad survey of the scientific research being conducted at the Rothamsted Experimental Station in the 1920s and a more focused look at Fisher’s research during the period. It will conclude by suggesting how the naturalization of artificial selection characteristic of early population genetics might relate to some of Fisher’s other concerns, including his thoughts on eugenics.
Session: XIII.10 Room: Peter Chalk 2.3 Time: Sunday 11:00-12:30
Davide Vecchi
Konrad Lorenz Institute, Altenberg, Austria
Two Challenges For Evolutionary Epistemologies Based On Selection Theory
Some people (the localists) believe that every selective regime is different and highly dependent on contextual peculiarities, while others (the generalists) argue that there exist certain features of evolutionary change that are trans-contextual. In this talk I endorse the generalist line of reasoning - the view that there are generalisations about selection and evolution, and that there exist interesting structural similarities between life and science. From this perspective I will explore two important foundational issues in evolutionary epistemology, the Lamarckian challenge and the causal role of drift, by considering recent and less recent developments in evolutionary biology. I have two aims in my talk. First, to show that selection theory is a valuable intellectual endeavour even if it should eventually turn out that life and science are fundamentally disanalogous processes. Secondly, to re-address the agenda of evolutionary epistemology on a less sociological and more biologically oriented path.
In the scientific context, by “Lamarckian challenge” I refer to the view that intellectual variation is not “blindly” generated but somehow “directed” to the solution of the problem faced by the scientist. Hull and Campbell took as fundamental the rejection of this challenge: cultural and scientific evolution are as Darwinian as biological evolution. If they turn out to be wrong then selection theory faces a significant challenge. Things are not so clear-cut of course, as recently the Lamarckian challenge has been revived by Jablonka and Lamb also in the strictly biological context. Jablonka and Lamb have provided a sound analysis of the various ways in which biological and cultural variants are generated (e.g., random, directed and biased). However, I believe that Jablonka and Lamb are too provocative in labelling “biased” variation as Lamarckian, while I suspect they are wrong in arguing that genuine Lamarckian variation exists. In the talk I will argue that their views are compatible with Campbell’s Darwinian notion of vicarious selection. I will also ponder whether, in the light of Jablonka’s and Lamb’s research, it makes any more sense to consider the Lamarckian challenge a genuine challenge for evolutionary epistemology, given that the disanalogy between life and science seems to disappear, moreover not in the direction orthodox Darwinians would favour.
If a sound analogy exists between biological and scientific evolution then evolutionary epistemology has to refer to the rich repertoire of processes that affect biological evolution. In biology non-selective processes play a central explanatory role. Drift is one of them. Hull highlighted the fact that science is organised in demes, while Sewall Wright considered drift an important process in the exploratory phase of his shifting balance theory. The two ideas seem to meet naturally. In the second part of this talk I will explore the heretical idea that drift plays a role in scientific evolution. I will consider under what circumstances scientific drift might affect scientific evolution, and whether the highly cherished cumulativity of scientific knowledge is affected. Finally, I will consider whether ascribing a causal role to scientific drift poses a problem for selection theory.
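The intuition that deme size matters for drift can be made concrete with a toy simulation. The sketch below is not part of the author's argument; it merely illustrates, under a deliberately crude Wright-Fisher-style model of 'random copying' of a technique within a research community, that a variant with no selective (or epistemic) advantage can nonetheless go to fixation, and that community size mainly changes how fast this happens.

```python
import numpy as np

def neutral_drift(deme_size, p0, max_gen, rng):
    """Wright-Fisher-style random copying with no selection: each
    'generation' the number of researchers using the variant is a
    binomial draw based on the previous generation's frequency."""
    count = round(p0 * deme_size)
    for gen in range(1, max_gen + 1):
        count = rng.binomial(deme_size, count / deme_size)
        if count == 0 or count == deme_size:
            return count == deme_size, gen      # lost or fixed, and when
    return None, max_gen                        # still segregating at the cap

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    for size in (10, 100, 1000):
        runs = [neutral_drift(size, 0.2, 50 * size, rng) for _ in range(300)]
        fixed = sum(1 for outcome, _ in runs if outcome is True)
        mean_t = sum(t for _, t in runs) / len(runs)
        print(f"deme size {size:4d}: neutral variant fixed in {fixed}/300 runs; "
              f"mean time to fixation or loss ~{mean_t:.0f} generations")
```

In runs of this kind the neutral variant fixes in roughly the proportion of runs equal to its starting frequency, whatever the deme size; what scales with deme size is the time taken to reach fixation or loss.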
Session: VII.8 Room: Peter Chalk 2.6 Time: Friday 16:30-18:00
Session: I.5 Room: Peter Chalk 1.3 Time: Thursday 11:30-13:00
Joel Velasco
Francisco Vergara-Silva,1 Carlos López-Beltrán2,
University of Wisconsin-Madison, Madison, United States
Fabrizzio McManus2
Prior Probabilities In Phylogenetic Inference
Given that modern computing algorithms and computing power allow the calculation of posterior probabilities on phylogenetic trees with a high degree of accuracy, the main objection to the use of Bayesian methods in phylogenetics is the belief that the posterior probability is not a good optimality criterion for trees. The theoretical problem mentioned most often in the literature is “the problem of the priors” - how to assign prior probabilities to various tree hypotheses. In the rare case that a defense is attempted, responses are typically one of two kinds: 1) The particular priors that are used are unimportant because they have negligible effect on the posteriors or 2) Uninformative priors can be used in order to ensure the posteriors are not biased in any way. In this paper I show that the priors used can affect the results dramatically and that the so- called “uninformative” priors typically used do in fact bias results significantly and so neither response is sufficient to solve “the problem of the priors.” I then go on to consider different methods of assigning prior probabilities and argue that in the general case, priors must be derived from a model of how outcomes are produced. In the particular case, this means that priors need to be derived from an understanding of how distinct taxa have evolved. The appropriate evolutionary model is that of lineage splitting, which is abstractly captured by the Yule birth-death process. This process leads to a well-known statistical distribution over trees. Though further modifications may be necessary to model more complex aspects of our knowledge, they must be modifications to parameters in an underlying Yule model. In addition, I propose a natural parameter addition to the Yule model which allows us to test the robustness of the posterior probabilities of the resulting trees and clades by simply adjusting the value of this parameter.
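As a concrete illustration of why the choice matters, the sketch below (my toy, not the author's code) compares the prior probability that two named tips form a clade of two under a Yule-style model with the value implied by a 'flat' prior over labelled rooted topologies. Sampling proceeds by random pairwise joining, which for topologies is a standard way of drawing from the Yule/coalescent distribution; that equivalence and the closed form 1/(2n-3) for the uniform prior are textbook results, but readers should check them before leaning on the numbers.

```python
import random

def focal_tips_are_sisters(n_tips, rng):
    """Sample one labelled rooted topology by repeated random pairwise
    joining (for topologies this matches the Yule/pure-birth prior) and
    report whether tips 1 and 2 come out as sisters."""
    lineages = [frozenset([i]) for i in range(1, n_tips + 1)]
    a, b = frozenset([1]), frozenset([2])
    while len(lineages) > 1:
        x, y = rng.sample(lineages, 2)
        if {x, y} == {a, b}:
            return True                      # focal tips joined each other
        if x in (a, b) or y in (a, b):
            return False                     # a focal tip joined something else first
        lineages.remove(x)
        lineages.remove(y)
        lineages.append(x | y)
    return False

if __name__ == "__main__":
    rng = random.Random(1)
    reps = 100_000
    for n in (4, 6, 8):
        yule = sum(focal_tips_are_sisters(n, rng) for _ in range(reps)) / reps
        flat = 1.0 / (2 * n - 3)             # uniform over labelled rooted topologies
        print(f"n={n}: P(tips 1 and 2 are sisters)  Yule ~ {yule:.3f}   flat prior = {flat:.3f}")
```

Even in this tiny case the two priors disagree, and the disagreement is systematic rather than sampling noise, which is the kind of effect the paper argues cannot be waved away as negligible.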
1) Instituto de Biología, Universidad Nacional Autónoma de México, Mexico City, Mexico, 2) Instituto de Investigaciones Filosóficas, Universidad Nacional Autónoma de México, Mexico City, Mexico
The Mexican Institute for Genomic Medicine (INMEGEN) and the Invention of the Mexican ‘Mestizo’ Genome
We propose to critically analyze some recent developments within the Mexican Health System that have led to the foundation of a brand new Mexican Institute for Genomic Medicine (INMEGEN, July 2005). The central project of this national institute is the construction of a local haplotype map of single nucleotide polymorphisms (SNPs) -inspired by the activities of the International HapMap Consortium (IHMC; Nature, October 2005)- that would in turn constitute the basis for establishing a ‘platform for genomic medicine in Mexico’. An important tenet behind the rationale of INMEGEN’s project is that filling in the currently existing gaps in the human HapMap, partially caused by the omission of Latin-American populations, will provide indispensable support for ongoing and future local research around genetically conditioned diseases.
We argue that the way this ‘Mexican Genome Project’ has been presented to divergent audiences, along with certain aspects of implementation, could puzzle interested parties coming from both the scientific and philosophical communities. The notion that to a given political space (a nation: Mexico) corresponds a given population with a given biological (genomic) structure, and that the State’s National Health Institutions have the responsibility to uncover that (self-) knowledge, protect it and use it to cater for the Health and Good Life of the very same population, is paradoxical and could generate many conceptual, biological, ethical and political conundrums, especially when taking into account the crucial support provided by multinational firms in the infrastructure setting and analysis of the data produced by INMEGEN. In a more specific historical/philosophical context, we consider a salient problem in INMEGEN’s project to be the rhetorically deliberate ambiguity –apparently exploited to infuse the project with a nationalistic feel- derived from a questionable strategy to link new empirical results with separate investigations of genomic structure and variation at the populational scale (Human Genome Diversity/Genographic Project-related), when the main declared scientific justifications are instead connected to
predominantly causal/mechanistic (as opposed to historical/genealogical) research. Without using their own results to clarify such a central issue, INMEGEN has coarsely employed the biologically suspicious label ‘Mestizo’ in all introductory sections of its proposals. One among several insufficiently analyzed race-based categorizations, Mestizo has traditionally been the racial call to arms for Independent, Modern, Nationalistic, Raza de Bronce, anti-Imperialist Mexicans. Leaving aside the rich historical, demographical and anthropological knowledge concerning the variegated origins and conformation of the human populations that nowadays live within the Mexican territory, the declared basis of INMEGEN’s sampling strategy has been the crude mythological notion that, apart from a small percentage of “pure line” Europeans and Indians, the rest of Mexicans are Mestizos, with different, regionally stable, and objectively ascertainable proportions of White, Indian and African ancestries. In our view, the use of such ideology-charged notions has been helpful to successfully finance INMEGEN’s projects out of state budgets, but in the end might not contribute to scientific insight into the basis of infraspecific human classification in this area of the world. After further discussing classificatory/genealogical issues in relation to the International and Mexican/Mestizo HapMap projects, we will claim that, given the current sampling design, “the genealogical might affect the causal” – i. e. conclusions regarding the allocation of the etiology of disease in ‘Mexicans’ might end up being flawed. Finally, we will address how our case study could enrich discussion on philosophical issues associated to the current global ‘genomics, race/ethnicity and biomedicine’ debate.
Hugo Viciana1,2, Hugo Mercier3
Convergent Minds? Examining Some Current Assumptions in the Study of Comparative Social Cognition of Apes, Crows, Dogs, Children and Other Animals
When studying comparative social cognition in the anthropocentric approach (in the sense of Sara Shettleworth), it is common to look for homologies or homoplasies. In this context a very interesting case of homoplasy is convergence: a strong case has been made lately for its occurrence in the so-called high-level complex capacities of different taxa. We first offer a glimpse into the different claims of anthropocentric convergence in the current ethological literature on the evolution of cognition. In doing this, we show how apes, monkeys, corvids, cetaceans, children and other animals are currently researched in depth by ethologists for their mental capacities in the light of an adaptationist program on convergent evolution. Nevertheless, this exciting research poses some philosophical problems old and new, which we enumerate. One of these is the question of the realism of the different categories of “convergent cognitive tool kits” (Emery & Clayton, 2004), and thus whether they are only heuristic devices or fully-fledged ontological statements. From the common assumption that convergent evolution is a process that takes place on many levels (also in cognitive evolution!), we then proceed to question whether different ethologists actually mean the same thing when making a claim for convergent cognitive evolution. When are selection pressures and phenotypic cognitive traits really equivalent? We argue that the answer becomes especially tricky when the description of the trait or of the selection pressure includes a cognitive-representational term. From some simple distinctions between function, mechanism and structure, we raise the suspicion that the chosen level of description may not always be doing the right work, as when different selection pressures may have resulted in what are, under only some arbitrary descriptions, similar cognitive capacities (thus implying ‘chance’), or when what is described as an apparent cognitive convergence is rather the result of a significant parallelism, a constraint in the form of a “Corinthian column”, to use Gould’s words. In this presentation we try to provide criteria for avoiding these obstacles, which is also an epistemological beginning of an answer to the question of whether there can be a realist interpretation of the convergent minds.
Session: VII.7 Room: Peter Chalk 1.6 Time: Friday 16:30-18:00
1) Institut d’Histoire et Philosophie des sciences et des techniques, Paris, France, 2) Evocog - University of Balearic Islands, Palma, Spain, 3) Institut Jean Nicod, Paris, France
Alicia Villela
Debates in Reproductive Technologies: Semen Banks and Artificial Insemination in USA
Between the years 1938 and 1945, a number of scientists observed that sperm could survive freezing and storage at temperatures as low as minus 321 degrees Fahrenheit. But surviving is one thing; being able to function successfully in the conception process is another. It was not until the early 1950s that two British scientists developed a method of using a syrupy substance known as glycerol to protect semen from injury during freezing. The process was further refined in 1953 by Dr. Jerome K. Sherman, an American pioneer in sperm freezing.
Session: X.3 Room: Newman D Time: Saturday 14:00-15:30
UNAM-Biology Department, Distrito Federal, Mexico
Sherman introduced a simple method of preserving human sperm using glycerol; he combined this with slow cooling of the sperm and storage with solid carbon dioxide as a refrigerant. Sherman also demonstrated for the first time that frozen sperm, when thawed, were able to fertilize an egg and induce its normal development.
As a result of this research, the first successful human pregnancy with frozen spermatozoa was reported in 1953 (shortly before a Cook County court ruled that donor insemination (DI) was “contrary to public policy and good morals”). Considering the hostile climate for DI at the time, it is not surprising that nearly a decade passed before the first public announcement of a successful birth from frozen sperm.
Artificial insemination and the use of semen banks have now become central treatments for fertility patients.
In what follows, I examine the development of semen banks in the USA for eugenic purposes.
Session: XIII.5 Room: Peter Chalk 1.3 Time: Sunday 11:00-12:30
Denis Walsh
University of Toronto, Toronto, Ontario, Canada
Session: IV.5 Room: Peter Chalk 1.3 Time: Friday 9:30-11:00
C. Kenneth Waters
Fitness, Discreteness and Compositionality
Minnesota Center for Philosophy of Science, University of Minnesota, Minneapolis, MN, United States
Getting Real about Genetics and Genomics: An Anti-realist Perspective
Inflated accounts of knowledge in genetics and genomics are reinforced by the epistemological idea that successful research is organized by comprehensive theoretical frameworks that identify fundamental entities and processes. According to this epistemology, the success (or failure) of genetics and genomics depends on a comprehensive, theoretical framework that identifies the fundamentals of heredity and development. In this paper, I advance a deflationary epistemology for understanding genetics and genomics. Research in these sciences, I contend, is organized around investigative strategies involving the manipulation of a broad range of biological processes; it is not structured by comprehensive theorizing about the fundamentals of information, genetic programs, or developmental systems. I distinguish comprehensive theorizing about alleged fundamentals from local theorizing about causes in situated processes. I argue that causal knowledge about situated processes provides all the knowledge scientists need to manipulate and investigate processes underlying heredity and development. There is indeed compelling evidence to believe these local claims. Claims about fundamentals play a different role; they create excitement that helps scientists recruit workers and secure financial resources.
The dynamical interpretation of the modern synthesis theory of evolution holds that natural selection and drift explanations articulate the causes of population change. I argue that the metaphysical commitments of this interpretation are inconsistent with the structure of modern synthesis explanations. This inconsistency is exposed by Gillespie’s seminal work on the definition
of fitness.
Drift and selection are explanatorily discrete in that selection explains those aspects of population change that drift cannot (and vice versa). The dynamical interpretation accounts for this discreteness by hypostasizing selection and drift as discrete, composable, proprietary causes of population change. Selection is identified as that process caused by variation in trait fitnesses. Drift is identified as that process caused by sampling error due to population size. The best argument for selection and drift being discrete, independent causes is that one can manipulate fitness variation and population size independently.
Gillespie’s work on fitness demonstrates that trait fitness must be defined as the mean and variance of individual fitnesses. The inclusion of variance as a component of trait fitness is crucial. If traits x and y have the same mean individual fitnesses but the variance of x is lower than that of y, then in a finite population x will have a higher expected growth rate than y; x is fitter than y. Only if trait fitness is defined in this way will variations in fitness correctly predict and explain changes in population structure attributable to selection.
One consequence of including variance as a component of fitness is that fitness describes the way changes in population structure are systematically affected by population size. This poses a dilemma for the dynamical interpretation. On one horn, if selection and drift are causes of population change and are identified (respectively) with the independently manipulable parameters of fitness difference and population size, as the dynamical interpretation proposes, they cannot be discrete, independent causes; drift must be considered a component of selection. On the other horn, if drift and selection are not these independently manipulable parameters, the best argument for supposing that they are causes of population change is undercut.
I argue that, given Gillespie’s definition of fitness, only the statistical interpretation of the modern synthesis can preserve the explanatory discreteness of selection and drift.
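The result being invoked is often summarised by a simple approximation (given here as a gloss on Gillespie's variance-in-fitness work, not as the paper's own formulation): when offspring numbers vary within a generation, the quantity that predicts the change in trait frequencies in a population of size N behaves roughly like

```latex
% Rough 'effective fitness' gloss on Gillespie-style variance discounting:
w_{\mathrm{eff}} \;\approx\; \mu \;-\; \frac{\sigma^{2}}{N}
% mu      : mean individual fitness associated with the trait
% sigma^2 : variance in individual fitness
% N       : population size
```

On this reading the population size N sits inside the fitness comparison itself, which is exactly why the abstract can say that variance-discounted fitness and drift cannot be treated as two fully independent, separately manipulable parameters.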
Session: VIII.8 Room: Peter Chalk 2.6 Time: Saturday 9:00-10:30
Claire Waterton
Centre for the Study of Environmental Change, Lancaster University, Lancaster, United Kingdom
Experimenting with the archive:
performance and emergence in the making of databases of nature
This paper is about recent attempts by scholars, database practitioners and curators to experiment in theoretically interesting ways with the conceptual design and the building of databases, archives and other information systems. The paper suggests that we are currently witnessing a time where close convergences are occurring between social theory and database/archive construction. The paper identifies a ‘move’ towards exposure of the guts of our archives and databases, towards exposing the contingencies, the framing, the reflexivity and the politics embedded within them. Examples will include biodiversity databases, the construction of indigenous people’s databases in Australia, and the building of new spaces for the natural history collections at London’s Natural History Museum. On one hand, in celebrating performance and emergence within the idea of the database, perhaps such moves belong to a new cultural age, characterised itself as a complex system in which the key dynamic (even of scientific facts or data) is recursivity, and coproduction and coevolution of multiple causalities (Hayles 2005: 28). This cultural age is as yet only partially dawning, and even more partially understood. On the other hand, might these daring experimental moves in performativity and emergence within data and databases be yanked ‘backwards’ into a more conventional understanding of the archive – one in which it is expected that the database or archive has of necessity both to reveal and to hide, that it has to incorporate its own tacit politics, and that it can do nothing other than harbour its own representational and mimetic histories and powers? The fact that individuals, research teams and institutions are experimenting with digital and other material and political forms of database/archive forces us to imagine both of these possible routes ahead. Both routes present difficulties, the first because it challenges our ways of thinking to unknown limits (we do not know where we are heading), and the second because we are still less-than-good at rich, hermeneutic readings of our contemporary cultural artefacts. Either way, this paper argues that we need new, culturally deeper ways of ‘reading’ databases and archives, so that we can get a grasp not only of the complex and emergent processes through which electronic texts and databases are made, but also of the ways in which databases, beyond their immediate performative effects (in making memory of biodiversity, cultural diversity and so on), implicitly project and perform ideas of the human subject in relation to science, nature and society.
Session: VIII.6 Room: Peter Chalk 1.5 Time: Saturday 9:00-10:30
Elizabeth Watkins
University of California, San Francisco, San Francisco, CA, United States
The Medicalization of Male Menopause in America
The topic of male menopause occupied space on the medical radar screen from the late 1930s through the mid-1950s, then virtually disappeared for the next four decades, until the late 1990s. By contrast, articles on this subject appeared in American popular magazines and newspapers at a consistent, if low-level, rate throughout the same period. This essay describes how male menopause became medicalized not by the driving forces of academic researchers and influential clinicians, but instead by a model perpetuated by laypeople and medical popularizers. A medicalized conceptualization of the body and the life cycle had become widespread by the second half of the twentieth century, as Americans grew accustomed to regarding their lives through the lens of medicine. People came to expect medicine to provide a cure for any ailment; in the wake of the development of the so-called wonder drugs, no affliction seemed beyond medical and pharmaceutical intervention. A medicalized model had also been effectively produced for understanding and treating the menopause in women; a parallel, if not identical, stage in the life course of men seemed reasonable. This framework, rather than persuasive evidence from the research laboratory or clinic, helped to medicalize male menopause and provided the basis for its eventual pharmaceuticalization at the end of the twentieth century.
Session: VII.11 Room: Peter Chalk 2.1 Time: Friday 16:30-18:00
Richard A. Watson
School of Electronics and Computer Science, University of Southampton, Southampton, United Kingdom
Compositional Evolution and symbiosis
Darwin’s masterful contribution was to provide an algorithmic model (a formal step-by-step procedure) of
how adaptation may take place in biological systems. However, the simple process of linear incremental improvement that he described is only one algorithmic possibility, and certain biological phenomena provide the possibility of implementing alternative processes. I show that certain mechanisms of genetic variation (such as sex, gene transfer, and symbiosis), allowing the combination of preadapted genetic material, enable an evolutionary process that is algorithmically distinct from the Darwinian gradualist framework. The differences between ‘compositional evolution’ and gradual evolution derive from the fact that they have different underlying algorithmic principles. The algorithmic principle of gradual evolution is simply ‘hill climbing’, i.e. linear incremental improvement, whereas the algorithmic principle underlying compositional evolution is divide-and-conquer problem decomposition. The method of solving a problem by decomposing it into more manageable subproblems is a familiar and intuitive concept in design and engineering. But whereas this is usually assumed to require top-down knowledge of how to decompose a problem, I show that it can be applied ‘bottom up’ within an evolutionary process. When this algorithmic distinction is understood, we see that it is no longer appropriate to try to force mechanisms like sex, lateral gene transfer, and symbiogenesis into the linear paradigm of the gradualist framework. Instead we must expand the framework of evolution by natural selection to include a greater range of algorithmic possibilities. I suggest that both gradual evolution and compositional evolution be included in evolution by natural selection - thus breaking the equivalence between evolution and gradualism that is previously assumed. Using evolutionary computation models I show that compositional evolution is capable of evolving certain kinds of complex systems, specifically, systems with modular interdependency, that would be considered unevolvable under the gradualist framework. Accordingly, our conceptions of what is difficult or easy to evolve need to be revised. In particular, I show that a system that has all the properties usually associated with evolutionary difficulty - in fact, a system that is pathologically difficult for gradual evolution - can nonetheless be easy to evolve with compositional mechanisms. A system can have an exponential number of local optima, wide fitness saddles, high-fitness points that are irreducibly complex, and accordingly the global optimum may have no accessible path of small changes that is monotonically increasing in fitness. Although gradual evolution cannot be guaranteed to find the global optimum in time less than exponential in the problem size in this pathological case (i.e. no better than random guessing), compositional evolution can find the global optimum with probability close to one in time only polynomial in the problem size. Accordingly, Darwin’s famous assertion needs revision. Even if it
could be demonstrated that a complex adaptation existed which could not possibly have been formed by numerous, successive, slight modifications (or even by random modifications of any size), his theory of evolution by natural selection would not break down - but gradualism would.
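The contrast between the two algorithmic principles can be made concrete with a toy experiment. The sketch below is only an illustration in the spirit of the argument, not Watson's own benchmarks or algorithms: it defines a hierarchical fitness function with modular interdependency (in the style of H-IFF), a single-bit-flip hill climber, and a crude 'compositional' search that exchanges large blocks between pool members, so a reader can compare how the two fare on the same landscape.

```python
import random

def hiff(bits):
    """Hierarchical fitness with modular interdependency (H-IFF style):
    every aligned block of size 1, 2, 4, ... whose bits all agree
    contributes its size to the score."""
    n, total, size = len(bits), 0, 1
    while size <= n:
        for i in range(0, n, size):
            block = bits[i:i + size]
            if all(b == block[0] for b in block):
                total += size
        size *= 2
    return total

def hill_climb(n, steps, rng):
    """Gradualist baseline: accept single-bit flips that do not lower fitness."""
    x = [rng.randint(0, 1) for _ in range(n)]
    best = hiff(x)
    for _ in range(steps):
        y = x[:]
        y[rng.randrange(n)] ^= 1
        fy = hiff(y)
        if fy >= best:
            x, best = y, fy
    return best

def compositional(n, steps, rng, pool_size=20):
    """Crude 'compositional' search: join the left half of one pool member
    with the right half of another (module exchange), plus a little mutation."""
    pool = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pool_size)]
    fits = [hiff(x) for x in pool]
    for _ in range(steps):
        i, j = rng.sample(range(pool_size), 2)
        child = pool[i][:n // 2] + pool[j][n // 2:]
        child[rng.randrange(n)] ^= 1
        f = hiff(child)
        worst = min(range(pool_size), key=fits.__getitem__)
        if f >= fits[worst]:
            pool[worst], fits[worst] = child, f
    return max(fits)

if __name__ == "__main__":
    rng = random.Random(0)
    n = 64
    print("global optimum:", hiff([0] * n))
    print("hill climbing: ", hill_climb(n, 20_000, rng))
    print("compositional: ", compositional(n, 20_000, rng))
```

Nothing here should be read as a result; the point is only to make 'hill climbing' versus 'divide-and-conquer by module exchange' something a reader can run and vary.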
Session: VII.2 Room: Newman C Time: Friday 16:30-18:00
Marcel Weber
University of Basel, Basel, Switzerland
Causes Without Mechanisms:
The Hodgkin-Huxley Model Revisited
Jim Bogen and Carl Craver have argued that the Hodgkin-Huxley (1952) model of the action potential, while serving many important epistemic roles, was not explanatory. On their view, to explain a phenomenon means to describe a mechanism that produces this phenomenon. Hodgkin and Huxley, by their own admission, did not know the mechanism of membrane permeability changes; therefore, they did not provide an explanation of action potentials (Craver, this symposium). At the same time, both of these authors subscribe to Jim Woodward’s account of causation. In this paper, I show that they cannot have both. If they want to hold to their view with regard to the Hodgkin-Huxley model, they ought to view Woodward’s analysis as being too permissive in identifying causes. For, as I will show, the Hodgkin-Huxley equations (in their classical form of 1952) satisfy all the requirements of counterfactual-supporting, invariant generalization in Woodward’s sense. If this account is accepted, causal explanations without knowledge of underlying mechanisms are possible. The paper concludes by drawing some comparisons to other biological disciplines such as genetics, where there are also historical examples of causal explanations that are silent or neutral with respect to the mechanistic realisers of causes.
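For readers who do not have the model in front of them, the equations at issue are, in their now-standard 1952 form (reproduced here for orientation, with the usual notation):

```latex
% Hodgkin-Huxley membrane equation (1952), standard notation:
C_m \frac{dV}{dt} = I_{\mathrm{ext}}
  - \bar{g}_{\mathrm{Na}}\, m^{3} h\, (V - E_{\mathrm{Na}})
  - \bar{g}_{\mathrm{K}}\, n^{4}\, (V - E_{\mathrm{K}})
  - \bar{g}_{L}\, (V - E_{L}),
\qquad
\frac{dx}{dt} = \alpha_x(V)\,(1 - x) - \beta_x(V)\,x
\quad \text{for } x \in \{m, h, n\}.
% The gating variables m, h, n and the fitted rate functions alpha_x, beta_x
% describe the permeability changes phenomenologically, without specifying
% the underlying membrane mechanism, which is the point at issue in the paper.
```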
Session: X.5 Room: Peter Chalk 1.3 Time: Saturday 14:00-15:30
Michael Weisberg
University of Pennsylvania, Philadelphia, PA, United States
Simplicity and Generality in Biological Modeling
This paper considers modelers’ much-touted preference for simple models. Many reasons have been offered for this preference, including the special role of simple models in highlighting difference-making causes, the explanatory power of simple models, and, most commonly, that simple models are more general. So common is the claim that simple models are more general, and preferable whenever at all plausible, that in a recent review article Olivia Judson criticized theorists building individual-based ecological models because, given their complexity, they are not very general. Despite the commonly held belief that simple models are more general, I will argue that, all things being equal, simpler models are less general than more complex models. In fact, what drives the intuition that simpler models are more general is not that they intrinsically apply to more targets, but rather that they are usually evaluated with lower standards of fidelity. This evaluation is typical but entirely optional, and an objective comparison of more and less complex models for the same targets, using the same standard of fidelity, will show the complex model to be more general. After diagnosing the origin of the perceived generality of simple models, I will consider the special place of simple models in the theoretical enterprise of population biology.
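The argument about fidelity standards can be made concrete with a deliberately artificial toy (mine, not the paper's): a 'simple' one-parameter linear model and a 'complex' two-parameter model that nests it are each scored against a family of target curves, counting a target as captured when the best grid-searched fit stays within a fidelity tolerance. Loosening the tolerance makes the simple model look broadly applicable; holding the tolerance fixed, the nesting model captures at least as many targets.

```python
import math

def targets(n=30):
    """A family of target 'growth curves' y(t) = t + c*t^2 with varying curvature."""
    return [[t + c * t * t for t in range(11)] for c in (k / n for k in range(n))]

def best_fit_error(model, grid, data):
    """Smallest maximum absolute error achievable over a crude parameter grid."""
    best = math.inf
    for params in grid:
        err = max(abs(model(t, *params) - y) for t, y in enumerate(data))
        best = min(best, err)
    return best

def simple(t, a):            # one free parameter
    return a * t

def complex_model(t, a, b):  # two free parameters; nests the simple model (b = 0)
    return a * t + b * t * t

simple_grid = [(a / 10,) for a in range(0, 51)]
complex_grid = [(a / 10, b / 50) for a in range(0, 51) for b in range(0, 51)]

for tol in (20.0, 2.0):      # a loose and a strict standard of fidelity
    data_sets = targets()
    n_simple = sum(best_fit_error(simple, simple_grid, d) <= tol for d in data_sets)
    n_complex = sum(best_fit_error(complex_model, complex_grid, d) <= tol for d in data_sets)
    print(f"tolerance {tol:>4}: simple model captures {n_simple}/{len(data_sets)} targets, "
          f"complex model captures {n_complex}/{len(data_sets)}")
```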
Session: II.9 Room: Peter Chalk 2.5 Time: Thursday 14:30-16:00
Michael Wheeler
University of Stirling, Stirling, United Kingdom
Session: III.4 Room: Peter Chalk 1.1 Time: Thursday 16:30-18:00
Thomas Wieland
What Codes For What In Development?
My target here is the widely held claim that genes code for phenotypic traits during development. I argue that any plausible defence of this claim must meet (what I call) the weakened uniqueness constraint (WUC). WUC states that any successful account of genetic coding must have the consequence that those non-genetic elements for which it would be unreasonable, extravagant, or explanatorily inefficacious to claim that their contribution to development is one of coding for phenotypic traits do not count as making such a contribution. Meeting WUC is a necessary condition for success here because one central thing that coding talk about genes is supposed to do is to help us make good on the thought that genes are privileged causal elements in development. I go on to argue that several initially attractive accounts of genetic coding (including those that appeal to selection and inheritance) fail to meet WUC. I then lay out an alternative account of coding talk in development, according to which the presence, in some system, of (i) an appropriate species of causal co-variation, (ii) arbitrariness, and (iii) homuncularity is sufficient to secure the explanatory credentials of such talk. Crucially, the system underlying protein synthesis meets these conditions. I proceed to argue, however, that this account has a radical implication, namely that mRNA, not DNA, is the locus of coding during development. This implication arises because, on the view developed here, coding talk requires a producer subsystem and a consumer subsystem. The coding elements are the structures that support certain communicative transactions between these subsystems. The producer encodes information into the structures in question, the consumer decodes information from them. In the case of protein synthesis, the producer subsystem is constituted by the mechanisms underlying transcription plus, in eukaryotes, RNA splicing, while the consumer subsystem is constituted by the distributed mechanism of ribosomes and tRNA that realizes the process of translation. The coding vehicles are the mRNA molecules produced during the first of these processes and consumed during the second. Given that genes are no longer the locus of developmental coding, this account is not in the business of meeting WUC as stated above. However, it plausibly meets an analogous constraint for mRNA. In a final twist I consider a further argument that, if correct, establishes that while mRNA base triplets code for proteins, they don’t code for traits. If this is right, then developmental coding talk is limited at both ends. It doesn’t stretch as far back as genes, and it doesn’t stretch as far forward as phenotypic traits.
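The producer/consumer picture can be illustrated with a toy rendering of the standard genetic code (an illustration of the biology the paper presupposes, not of the paper's own machinery): a 'producer' step rewrites a coding-strand DNA string into mRNA, and a 'consumer' step decodes mRNA triplets into amino acids via a lookup table, whose arbitrariness is exactly what condition (ii) points to.

```python
# Toy producer/consumer rendering of protein synthesis. Only a handful of
# codons from the standard genetic code are included, enough for the example.
CODON_TABLE = {
    "AUG": "Met", "UUU": "Phe", "AAA": "Lys", "GGC": "Gly",
    "GCU": "Ala", "UGG": "Trp", "UAA": "STOP",
}

def transcribe(coding_strand_dna: str) -> str:
    """'Producer': the mRNA carries the coding-strand sequence with T -> U."""
    return coding_strand_dna.upper().replace("T", "U")

def translate(mrna: str) -> list[str]:
    """'Consumer': decode successive triplets via the (arbitrary) codon table."""
    peptide = []
    for i in range(0, len(mrna) - 2, 3):
        residue = CODON_TABLE.get(mrna[i:i + 3], "???")
        if residue == "STOP":
            break
        peptide.append(residue)
    return peptide

if __name__ == "__main__":
    dna = "ATGTTTGGCTGGTAA"              # hypothetical coding-strand fragment
    mrna = transcribe(dna)               # "AUGUUUGGCUGGUAA"
    print(mrna, "->", translate(mrna))   # ['Met', 'Phe', 'Gly', 'Trp']
```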
Munich Center for the History of Science and Technology, Munich, Germany
Coping with the ‘Hoechst Shock’: Perceptions and Cultures of Molecular Biology in Germany
In 1981, the German chemical/pharmaceutical company Hoechst signed an agreement with the Massachusetts General Hospital (MGH). The deal committed Hoechst to spending $70 million over ten years for the support of a newly built department of molecular biology at Boston’s renowned research hospital. MGH, in return, was obliged to train the company’s scientists in molecular biology and to grant licenses for patents that grew out of Hoechst sponsored research.
Calling into question the quality of German molecular biology and the effectiveness of long-standing government policies towards biotechnology, the well-publicised agreement came as a shock to scientists, policy makers, and the informed public. Hence, the so-called ‘Hoechst shock’ gave rise to an intensive debate on the status of molecular biology research in Germany and on suitable strategies to improve it.
Placing this debate in a wider historical context, the paper examines changing perceptions and cultures of molecular biology in Germany. In doing so, it may contribute an answer to the question of why Germany, despite early government initiatives for biotechnology, was a late starter in the ‘new biotechnology’ arena.
Session: XII.4 Room: Peter Chalk 1.1 Time: Sunday 9:00-10:30
John Wilkins
University of Queensland, Brisbane, Australia
The unseasonable lateness of Being-What-It-Is, or, the myth of biological essentialism
The received view of the history of the species concept is that before Darwin, naturalists held to a view of essentialism, according to which species were constituted by necessary and sufficient traits. I will argue that this is a misunderstanding based on a conflation of the Aristotelian logical and metaphysical tradition of the essence of predicates with the use of the term “species” and the Greek term “eidos” in natural history and as vernacular terms. Instead, I will attempt to show that taxonomists (including Darwin) held to a diagnostic or taxonomic essentialism, but that nobody before Darwin, with a possible exception in Grew, argued that a species had a material or causal essence, and that the notion of biological essentialism arose after Darwin, possibly as a reaction to Haeckelian evolutionary ideas in French- and German-speaking countries, based on the revival of Thomism in Catholic intellectual circles after 1871, between the 1890s and the 1920s. The myth of essentialism appears to have been formulated around the centenary of the Origin and after, based perhaps on the early experiences of Mayr as an undergraduate. Further, there is a lack of connection between species fixism and essentialist metaphysics. No appeal to Aristotelian metaphysics, or to Thomism in any of its forms, is the foundation for species fixism.
Session: XI.5 Room: Peter Chalk 1.3 Time: Saturday 16:00-17:30
Modularity, Memes, and Scaffolding in Cultural Evolution
I investigate the conditions under which modularity can emerge in cultural evolution. Material culture and mass production show significant and multi-dimensional scaffolding for the production of a diverse array of general and special function parts, for the standardized training of agents who can serve specialized roles in a highly differentiated culture, and for apparently modular bits of information produced, acquired and transferred between these agents. Although superficially suggestive of memes, the multiple copies produced in such situations are dependent on the production of specialized scaffolding and not plausibly regarded as self-replicating. The interesting processes involve the production, development and maintenance of the scaffolding, and not usually the reproduction of the apparently memetic elements. Their mode of production shares no interesting features with genetic replication, but perhaps more with the production of proteins.
Brad Wilson
Slippery Rock University, Slippery Rock, PA, United States
Bridging the Gap Between Theory and Experiment in Ecology
Despite advances in both areas (and attempts to bring them together; see Kareiva, 1989; Morin, 1998; Turchin, 2003), theoretical (mathematical) ecology and experimental ecology have remained somewhat distant from one another. One of the reasons for this is that the experimental and natural systems that ecologists study are highly complex and resist easy representation in mathematical models. However, unless a link can be forged between theory and experiment, the theoretical value of experiments (where ‘theoretical value’ is understood in terms of the development of general theories or models that can be extended to other systems) will be limited.
In my paper, I look at two ecological studies (Thrush et al., 2000; Paine and Levin, 1981) that illustrate the challenges involved in connecting experiment and theory in ecology. In particular, I focus on two issues: the difficulty of developing and applying a mathematical model to a natural system and the problem of determining the degree to which experimental results can be generalized. I conclude with some comments regarding the relationships among theories, models, and experiments in ecology.
Session: III.7 Room: Peter Chalk 1.6 Time: Thursday 16:30-18:00
William Wimsatt
University of Chicago, Chicago, IL, United States
Session: VI.3 Room: Newman D Time: Friday 14:30-16:00
Rasmus Winther
Universidad Nacional Autonoma de Mexico, Mexico City, Mexico
Mechanisms, History and Parts in Compositional Biology
Standard philosophical accounts of natural science take formal mathematical laws and models as fundamental to scientific explanation and generalization. According to these law-based interpretations of science, the laws of physics and the mathematical models of population genetics are paradigmatic cases of explanatory generalizations. Despite their validity for many
scientific domains, these philosophical accounts do not provide an adequate understanding of biology taken as a natural science and considered in its entirety. For instance, key biological disciplines such as physiology and systematics employ, respectively, mechanistic and historical explanations, not mathematical laws and formalisms. Indeed, though these kinds of explanations predominate in the research occurring in biology departments, they have received practically no attention in general philosophy and philosophy of science, and only some consideration in the philosophy of biology.
In this talk, I attempt to address this philosophical gap by first defining the general family of “part-based” explanations (as opposed to law-based), which includes mechanistic and historical explanations. Part-based explanations individuate diverse components of complex systems and explain their properties and relations, such as their mechanistic interactions and historical origins, by using non-mathematical generalizations, such as functional and narrative ones. Examples include elucidating the mechanisms of cellular respiration and the origin of the panda’s thumb.
dominant. The dispute rekindled in the 80s when naturalistic philosophers such as Karen Neander and Ruth Millikan developed a teleological notion of function based on Darwin’s theory. In 1987, Darwin scholar Michael Ghiselin felt compelled to intervene; he dismissed the idea that Darwin has brought back teleology into science as a myth. Six years later, Aristotle scholar John Lennox replied: ‘Darwin *was*
I propose to resolve this confusion by distinguishing several notions of teleology. Darwin’s theory allows us to explain organization and adaptation without appealing to directionality in individual development (internal teleology) or the hand of a creator (external teleology). However, in many cases it makes sense to speak of what a trait is (maintained) for in the population. This ‘