Notes: 9/24

Red-Herring Fallacy: An irrelevant topic is presented to divert attention from the relevant topic.

Example: Graduate study and Budget

Undistributed Middle Fallacy: People assume that when A = C and B = C, then A = B, but this does not follow (see the set sketch after the examples).

Examples: A = Teenagers, C = 2-legged, B = Ostriches

A = Humans, C = Living, B = Plants
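
A minimal sketch in Python (the member names here are made up) of why the shared middle term proves nothing: both categories can sit inside C while having nothing to do with each other.

# A = C and B = C (sharing the middle term C) does not give A = B.
teenagers = {"Ann", "Bo"}                             # A
ostriches = {"Ozzy", "Olive"}                         # B
two_legged = teenagers | ostriches | {"Chickadee"}    # C, the shared middle term

assert teenagers <= two_legged     # "teenagers are 2-legged"
assert ostriches <= two_legged     # "ostriches are 2-legged"
print(teenagers == ostriches)      # False -- the shared predicate proves nothing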

Handout: Termites inspire paper pusher

-Discussion on presentations

-Question Handout

Is it contradictory to try to learn about intelligence by building something mechanical modeled on the human brain, which is not mechanical?

Are the people doing this claiming intelligence as some sort of discovery?

-Bomb-on-the-cart problem: Can it be solved through consistent sensors and staying in close touch with the immediate environment?

What about processing versus if/then solutions? Robots must go through step one to get to step two; we don't (working short-term to reach the long term versus keeping the long term always in mind). See the sketch after these questions.

And what about self-correction?
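
A hedged toy sketch (the rules and state names are invented here, not taken from the handout) of the if/then, step-one-before-step-two style of processing these questions point at: the planner only checks what it was explicitly told to check, and nothing in it self-corrects.

# Hypothetical if/then planner for the bomb-on-the-cart story.
world = {"cart_in_room": True, "battery_on_cart": True, "bomb_on_cart": True}

def plan(world):
    steps = []
    if world["cart_in_room"]:
        steps.append("enter room")        # step one must finish...
    if world["battery_on_cart"]:
        steps.append("pull cart out")     # ...before step two is even considered
    # No rule ever looks at world["bomb_on_cart"], and nothing here
    # revisits the plan once it is set.
    return steps

print(plan(world))   # ['enter room', 'pull cart out'] -- the bomb comes along too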

Homunculus Problem: the “little man” in our head running the show. This is a fallacy: the explanation is circular and never answers the question.
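
A toy illustration (the function and wording are made up here) of why this is circular: if seeing is explained by an inner viewer who also has to see, the explanation just calls itself and never bottoms out.

import sys
sys.setrecursionlimit(60)   # keep the inevitable failure small

def explain_seeing(observer):
    # "The observer sees because a little man inside watches the picture" --
    # but that little man's seeing then needs the very same explanation.
    return explain_seeing("little man inside " + observer)

try:
    explain_seeing("the person")
except RecursionError:
    print("circular: the explanation never bottoms out")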

Where would our thinking about the intelligence of humans and computers go if there were a homunculus - would that be the end of it, except for trying to recreate it in a computer?

Mental or physical autonomy versus situatedness: there are different levels of each; how do they relate to intelligence?

-100% autonomy can never be achieved

-Self-sufficiency in robots versus humans: finishing tasks, or choosing not to, versus simply breaking down