Critique: Grammatical Design
Ken Brown *

      As I got further into this article, I started to lose focus on the main topic. I had to go back and confirm whether it was about designing grammars, grammars for design, or designing grammars useful in design. This was strange, since the idea of using a grammar to represent a search space, along with methods for moving about in that space, seemed very intuitive to me. I thought it was clever to cast design in a light that many computer scientists would instantly recognize. It was also nice to see AI search techniques referred to as "standard". While that last sentence has very little to do with the rest of the paragraph, it just didn't fit anywhere else and I didn't want it to be lonely.
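
      To make that intuition concrete, here is a tiny sketch (in Python, with a made-up toy grammar and string-based design representation, none of it from the article) of what "grammar as search space" means: each rule is a rewrite, applying a rule is a move, and standard search techniques operate over the neighborhoods those moves define.

    # Toy grammar: designs are strings, each rule rewrites a symbol.
    # The rules and symbols are invented purely for illustration.
    rules = [
        ("S", "beam(S)"),   # refine a spec into a beam containing a sub-spec
        ("S", "plate"),     # terminate a spec as a plate
        ("S", "S+S"),       # split a spec into two parts
    ]

    def neighbors(design):
        """All designs reachable by one rule application (the search moves)."""
        result = []
        for lhs, rhs in rules:
            i = design.find(lhs)
            if i != -1:
                # Replace the first occurrence only, for simplicity.
                result.append(design[:i] + rhs + design[i + len(lhs):])
        return result

    # Any standard search then applies directly, e.g. a breadth-first
    # enumeration of everything derivable from "S" within three steps:
    frontier, seen = ["S"], set()
    for _ in range(3):
        next_frontier = []
        for design in frontier:
            for candidate in neighbors(design):
                if candidate not in seen:
                    seen.add(candidate)
                    next_frontier.append(candidate)
        frontier = next_frontier
    print(sorted(seen))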

      Shape annealing, as described, sounds a lot like a genetic algorithm. The main difference is that the population has only one member, so there is no cross-over when generating new candidates. Everything else is analogous: the random application of shape-modifying rules corresponds to generating a new population, the numeric evaluation of the current design corresponds to fitness evaluation, and the slightly random back-tracking from poorly rated designs corresponds to choosing which members continue into the next generation (here, choosing between the current design and the previous one). Shape annealing also reminded me of a random-walk search for many of the same reasons.
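
      To make the analogy explicit, here is a minimal sketch of a shape-annealing-style loop viewed as a genetic algorithm with a population of one. The rule set, quality function, and cooling schedule are hypothetical stand-ins; the article does not spell out these details.

    import math
    import random

    def shape_anneal(initial_design, rules, quality, steps=1000, temp=1.0, cooling=0.995):
        """Shape annealing viewed as a GA with a population of one.

        `rules` is a list of functions that each rewrite a design (the grammar)
        and `quality` gives a numeric rating of a design; both are assumed
        placeholders for whatever the real system uses.
        """
        current = initial_design
        current_score = quality(current)
        for _ in range(steps):
            # "Mutation": apply a randomly chosen grammar rule to the lone member.
            candidate = random.choice(rules)(current)
            candidate_score = quality(candidate)
            # "Selection": keep the better design, or back-track to a worse one
            # with a probability that shrinks as the temperature drops.
            if candidate_score >= current_score or \
               random.random() < math.exp((candidate_score - current_score) / temp):
                current, current_score = candidate, candidate_score
            temp *= cooling  # cooling schedule
        return current

      The acceptance step plays the role of selection: the lone parent and its lone child compete, and the loser is discarded rather than kept in a population.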

      One thing to note concerning the possible problem of grammars limiting the creativity of such a system: natural language grammar has had very little, if any, limiting effect on human creativity. Maybe a brief description of the differences and similarities between computer-language grammars and natural-language grammars would help explain whether the same holds here.

      In the "Expanding the Model of Design" sidebar, it is said that shape annealing adapts its performance during the search. The main mechanism for this is reducing the randomness used to decide whether or not to backtrack, based on the current rating. Even though the system does not save any knowledge gained from this change in control, it is called learning. The system does not improve its performance over time. Maybe there are aspects of the system that were not presented or that I didn't understand, but I would say this is not machine learning. Earlier I said that shape annealing reminds me of genetic algorithms, which are generally accepted as a machine learning technique. The main difference between the two is that the shape annealing system described does not save anything that would help it solve a future problem better.
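
      For reference, the "adaptation" in question amounts to the annealing acceptance rule (assuming shape annealing uses something like the standard Metropolis criterion; the article does not give the exact form). The chance of backtracking to a worse design shrinks as the temperature drops, but the temperature and scores are reset for the next problem, so nothing carries over between runs.

    import math

    def backtrack_probability(current_score, candidate_score, temperature):
        """Probability of accepting a worse design (Metropolis-style criterion).

        An assumed form, not necessarily the one the article uses. The value
        depends only on this run's temperature and scores; no statistic
        survives to the next design problem.
        """
        if candidate_score >= current_score:
            return 1.0
        return math.exp((candidate_score - current_score) / temperature)

    # Early in the search (high temperature) a bad move is often allowed;
    # late in the search it almost never is.
    print(backtrack_probability(10.0, 8.0, temperature=5.0))   # ~0.67
    print(backtrack_probability(10.0, 8.0, temperature=0.1))   # ~2e-9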


* Ken Brown, "Grammatical Design," in IEEE AI in Design, March-April 1997.

by: Keith A. Pray