Talk:Apollo Guidance Computer

This article is within the scope of WikiProject Computing (rated B-class, High-importance; supported by the Computer hardware task force, assessed as High-importance), WikiProject Astronomy (rated B-class, Mid-importance), and WikiProject Spaceflight (rated B-class, High-importance).

CCS instruction

Fascinating article. Would love to know what the thinking is behind the CCS instruction - what sort of problem does it solve? What would be a modern (C, say) equivalent construction? GRAHAMUK 04:51, 2 Sep 2003 (UTC)

The CCS instruction can be used to perform the equivalent of the C language "if" statement, "switch" statement, "for" loop, or "while" loop. I was going to put a code fragment in here as an example, but the editor really mangles the formatting... --Pultorak 07:29, 4 Sep 2003
The code fragment would be interesting, and could be adequately formatted in Wikipedia by putting a space first in every line. If applicable, you might get it back via the Page history link and put it back in. --Wernher 00:34, 3 Mar 2004 (UTC)
Designer's comment: I could probably write a small book on CCS alone. Let me know if my amplifications in November left anything unanswered. Did I mention that the basic idea was lifted from the IBM 704/7094's CAS (Compare Accumulator with Storage)? --67.75.8.160 02:46, 1 Mar 2004
First, a question before going into the subject matter proper: does "designer" in the above comment by any chance mean "designer of the AGC and/or its instruction set"? Just wondering. And now to the question at hand: maybe the 7090/94 heritage should be mentioned in the article, in the interest of educating readers about the sometimes hidden, varying in degree and significance from case to case, but often very fascinating continuity underlying much of computer development history? --Wernher 00:21, 3 Mar 2004 (UTC) / 21 Apr 2004
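For anyone still curious about GRAHAMUK's original question, here is a minimal C sketch of how CCS maps onto familiar control flow. It assumes the commonly documented semantics (the accumulator receives the "diminished absolute value" of the operand, and control takes one of four successor paths depending on whether the operand was > +0, = +0, < -0, or = -0, since the one's-complement AGC distinguished +0 from -0); it is illustrative only, not AGC code:

 #include <stdio.h>

 /* Toy model of a 15-bit AGC word; is_neg_zero stands in for the one's-
  * complement -0 encoding, which plain C ints do not have. */
 struct agc_word { int value; int is_neg_zero; };

 /* CCS E: load the accumulator with the diminished absolute value of E and
  * report which of the four successor instructions would execute (0..3). */
 static int ccs(struct agc_word e, int *acc)
 {
     if (e.value > 0)                    { *acc = e.value - 1;  return 0; }  /* E > +0 */
     if (e.value == 0 && !e.is_neg_zero) { *acc = 0;            return 1; }  /* E = +0 */
     if (e.value < 0)                    { *acc = -e.value - 1; return 2; }  /* E < -0 */
     *acc = 0;                           return 3;                           /* E = -0 */
 }

 int main(void)
 {
     /* Typical AGC idiom "CCS COUNT / TCF LOOP / ..." counts a loop down to
      * zero; the rough C equivalent is a while loop on a decrementing counter. */
     struct agc_word count = { 5, 0 };
     int acc;
     while (ccs(count, &acc) == 0) {   /* path 0: counter was still > +0 */
         printf("loop body, counter now %d\n", acc);
         count.value = acc;            /* store A back, as a TS COUNT would */
     }
     return 0;
 }

The four-way skip is what lets a single CCS stand in for the test of a C "if", a "switch" on sign, or the condition of a "for" or "while" loop.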

Single/double precision

The "single" and "double" precision mentioned in the article links to IEEE definitions, but surely the author meant one or two of the 15/16-bit words the machine used? --Anonymous, 21 Apr 2004

Thanks for mentioning it, I have to admit I didn't think of that when inserting the links. :-) However, the opening paragraph of the respective 'IEEE articles' does actually define sgl/dbl precision fairly generally (i.e. using one vs two words), so hopefully we'll avoid misleading the readers too much. Eventually I think we should 1) make a separate article on general single/double precision numbers, and 2) incorporate even clearer introductory info, w/links to the general article, in the IEEE definition articles. --Wernher 16:13, 21 Apr 2004 (UTC)
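To make the one-word-vs-two-words point concrete: on the AGC a single-precision quantity occupied one 15-bit word (sign plus 14 magnitude bits, one's complement) and a double-precision quantity occupied two such words. The sketch below packs two word magnitudes into one value purely for illustration; the scaling and sign conventions of the real flight code were more involved:

 #include <stdio.h>

 #define AGC_MAG_BITS 14   /* magnitude bits per single-precision word */

 /* Combine the magnitudes of an upper and a lower single-precision word into
  * one double-precision magnitude (28 bits in this simplified model). */
 static long dp_value(int upper, int lower)
 {
     return ((long)upper << AGC_MAG_BITS) | (long)lower;
 }

 int main(void)
 {
     /* upper = lower = 0x3FFF gives the largest positive DP magnitude */
     printf("%ld\n", dp_value(0x3FFF, 0x3FFF));   /* 268435455 = 2^28 - 1 */
     return 0;
 }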

User interface

In this section there is a link to a picture and also to a diagram of the DSKY. The two do not match. [1] has these indicators:

+------------+------------+
|UPLINK ACTY |  TEMP      |
+------------+------------+
|  AUTO      |GIMBAL LOCK |
+------------+------------+
|  HOLD      |   PROG     |
+------------+------------+
|  FREE      | RESTART    |
+------------+------------+
| NO ATT     | TRACKER    |
+------------+------------+
|  STBY      |  [   ]     |
+------------+------------+
| KEY REL    | OPR ERR    |
+------------+------------+

Compare that to the LM DSKY interface diagram:

+------------+------------+
|UPLINK ACTY |  TEMP      |
+------------+------------+
| NO ATT     |GIMBAL LOCK |
+------------+------------+
|  HOLD      |   PROG     |
+------------+------------+
| KEY REL    | RESTART    |
+------------+------------+
| OPR ERR    | TRACKER    |
+------------+------------+
|  [   ]     |   ALT      |
+------------+------------+
|  [   ]     |   VEL      |
+------------+------------+


I believe the top is Block I, the bottom is Block II LEM (the CSM doesn't have ALT and VEL indicators).

Points of reference

For contemporary readers who may not have been alive at the time, or who may not be intimately familiar with computer architecture, the Apollo Guidance Computer could use a comparison to later well-known computers or calculators. For example, how would it compare to the HP-65 programmable calculator, or a personal computer, such as the Apple II or IBM PC that came a decade or so later? Quicksilver 02:13, 11 November 2005 (UTC)[reply]

Good point! Perhaps we should also put in the TI-83/89 and HP-48/49g+ modern calculators, since those would be quite well-known to many college students (and engineers, scientists) today. --Wernher 17:10, 13 November 2005 (UTC)[reply]
This is the exact reason that I came looking for the Apollo computer information. We've heard for years that the cheap $4 calculator in the checkout line has more computing power than the one that went to the Moon. So how much did it really have? I missed any reference to RAM or ROM. No changes in four years may not be a good sign. :) -- Kearsarge03216 (talk) 00:39, 27 March 2009 (UTC)[reply]
It's clearly put forth in the article (Apollo_Guidance_Computer#Description), but the architecture and terminology were not altogether the same as today. A quick reading of the text shows the computer's RAM was about 2K and ROM was about 36K, along with memory add-ons. This was (barely) enough for the straightforward, pared-down guidance math it was built to do, but packed into a startlingly small area for the time. The interface was all hardware. It was indeed less powerful than advanced handhelds made only a few years later, much less powerful than the earliest IBM PCs, but highly reliable. The AGC was more or less hand built, very expensively. $4? More like $1-2 when on sale at an office supply store: most throwaway calculators in 2009 would be more powerful overall, but maybe less reliable. Any article comparison with calculators or PCs would need to be reliably sourced, though. Gwen Gale (talk) 04:33, 27 March 2009 (UTC)[reply]
As a point of comparison, emulating the AGC using the Virtual AGC software takes about 2% of CPU time on a 3GHz Pentium-4 (i.e. that CPU can emulate the AGC at around 50x realtime). And given that it's probably running a hundred or more x86 instructions to emulate each AGC instruction, the AGC is less than 0.1% of the performance of the P4. Which makes you think, really, when you consider that it got men to the moon and back, with assistance from the ground. Mark Grant (talk) 07:45, 28 March 2009 (UTC)[reply]
This does seem to line up neatly with the 2.048 MHz clocking speed of the AGC noted in the article. I've got spreadsheets which, from my human outlook, seem to do complex calculations "instantly," though these would have taken several seconds on the AGC (given it had enough memory to hold the data, which it did not), never mind the staggering overhead of the X Window GUI along with all those daemons and such. There is likely a source floating about somewhere which talks about this and could be cited in the text. Gwen Gale (talk) 08:30, 28 March 2009 (UTC)[reply]
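As a back-of-the-envelope check of the figures in the two comments above (the 2% CPU load and the ~100-host-instructions-per-emulated-instruction estimate are taken from those comments, not from new measurements):

 #include <stdio.h>

 int main(void)
 {
     double cpu_fraction = 0.02;                    /* P4 load while emulating the AGC */
     double realtime_factor = 1.0 / cpu_fraction;   /* ~50x real time                  */
     double host_insns_per_agc_insn = 100.0;        /* order-of-magnitude estimate     */

     /* Rough fraction of the P4's instruction throughput the AGC represents */
     double agc_vs_p4 = cpu_fraction / host_insns_per_agc_insn;

     printf("emulation headroom: ~%.0fx real time\n", realtime_factor);
     printf("AGC vs P4 throughput: ~%.2f%%\n", agc_vs_p4 * 100.0);   /* prints ~0.02% */
     return 0;
 }

That works out to roughly 0.02%, consistent with the "less than 0.1%" figure given above.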

What, specifically, was it used for?

This is a fascinating article, but I find myself wondering what exactly this computer was used for on its missions. The article says that it was used to "collect and provide flight information, and to automatically control all of the navigational functions of the Apollo spacecraft," but I'd like to see a more detailed explanation than that.

For example, what kinds of "flight information" were collected? What were the "navigational functions" of the spacecraft? Presumably the computer did not "fly the spacecraft" in a completely automatic manner. I would be interested to know more about in what way the computer was used by the astronauts to operate the craft.

DrDeke 15:29, 8 March 2006 (UTC)[reply]

For a thorough explanation, there's a 500-odd page Delco manual covering the programs used by the AGC on the Apollo 15 mission at: http://history.nasa.gov/alsj/a15/A15Delco.pdf.
While the AGC didn't fly the spacecraft completely automatically (e.g. it didn't work out when the engines needed to fire to take it to the Moon, but if given that information by the crew it could fire the engines and control the burn), it was capable of a completely automatic landing on the Moon. Typically the AGC flew the LEM until a few hundred feet above the ground, then the astronauts would use the LEM controls to adjust the programmed landing site to ensure they were going to land on a flat area and not in a crater.
The Virtual AGC page at http://www.ibiblio.org/apollo/index.html has an AGC emulator and some of the real software which ran on it. There's a video of the Virtual AGC flying a simulated Apollo CSM at http://mysite.wanadoo-members.co.uk/ncpp/CSM_DAP.wmv
The video isn't terribly exciting as it's just firing the RCS thrusters to rotate the CSM to the specified orientation (45 degrees pitch and 90 degrees roll). It does give some indication of how the real AGC was used by the astronauts though. MarkGrant

The article still suffers from a certain lack of focus in the introduction. A naive reader could be forgiven for thinking that this was an article about embedded systems, as the lede leads with "The Apollo Guidance Computer (AGC) was the first recognizably modern embedded system". The focus at this point should be on what the AGC was and did. I'm looking at the article now (I'm in the middle of rereading Mike Collins's book "Carrying the Fire" (still my favourite after 30 some years, even with the advent of Chaikin's etc), Aldrin's "Men from Earth" and Don Eyles's "Tales from the Lunar Module Guidance Computer", hence the interest) and will make some changes to effect this in the next hour or so. Lissajous (talk) 14:17, 27 August 2009 (UTC)[reply]

I've added a better (I think) lede.--agr (talk) 14:52, 27 August 2009 (UTC)[reply]

Integrated Circuits

We should probably add more information about the decision to use ICs in the AGC rather than discrete transistors. I've added links to some documents on the klabs.org site discussing this decision and it would appear to have been highly contentious at the time but extremely sensible in hindsight. MarkGrant 02:11, 9 July 2006 (UTC)[reply]

PGNCS trouble

This is an excellent article, but I question the assertion that it was the program alarms that caused Neil Armstrong to go to manual control of the Apollo 11 landing. Is there a source? As far as I know, all of the astronauts went to manual control during lunar landings, and I've never seen Armstrong's decision singled out like this before. --MLilburne 09:16, 13 July 2006 (UTC)[reply]

It does say 'more manual', but it seems odd to me too. BTW, I've also added a comment on the root cause of the 1201 alarms, which I didn't see mentioned anywhere else. Mark Grant 16:53, 13 July 2006 (UTC)[reply]
There's a good discussion of manual control and lunar landings (though on a message board) here. What "manual control" would mean in this context is, I believe, going to P66, which all of the commanders seem to have done. So I do think that the article is wrong. But I'm going to ponder a bit more before changing anything. --MLilburne 17:06, 13 July 2006 (UTC)[reply]

In First on the Moon (Little, Brown, 1970), Armstrong says he took manual control when he realized they were about to land in a boulder field. The alarm problems were an issue because they distracted him from looking out the window and following landmarks, but it was the realization that they were heading for a poor landing spot that caused him to take over the throttle control so he could slow the rate of descent and allow more time at a higher altitude where he could select a better spot.--agr 15:00, 8 December 2006 (UTC)[reply]

The 1201 and 1202 alarms were caused by too many events from the Rendezvous Radar (part of the Abort Guidance System) which was attempting to track the CSM in case the abort switch was pressed during descent. This Rendezvous Radar was disabled during descent in subsequent missions, which is why you never hear about it after Apollo 11. --Neilrieck 03:04, 10 February 2007 (UTC)[reply]
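For readers wondering what the 1201/1202 codes actually meant inside the software: they were Executive overflow alarms (1202 = no free core sets, 1201 = no free VAC areas), raised when the radar-induced extra work left the scheduler with no free job slots, after which the software restarted and re-scheduled only the essential jobs. The sketch below is a loose C illustration of that mechanism, not the flight code; the names and the exact number of core sets are simplifications:

 #include <stdio.h>

 #define CORE_SETS 7    /* the Executive had only a handful of job slots; seven is often quoted */

 static int core_sets_in_use = 0;

 static void software_restart(void)
 {
     /* The real BAILOUT/restart logic dropped lower-priority work and
      * restarted the essential jobs from known restart points. */
     printf("PROG alarm 1202: executive overflow, software restart\n");
     core_sets_in_use = 0;
 }

 static void request_job(const char *name)
 {
     if (core_sets_in_use >= CORE_SETS) {
         software_restart();            /* no core sets available */
         return;
     }
     core_sets_in_use++;
     printf("scheduled %s (%d/%d core sets in use)\n", name, core_sets_in_use, CORE_SETS);
 }

 int main(void)
 {
     /* Spurious rendezvous-radar activity during the descent kept adding work
      * until the slots ran out. */
     for (int i = 0; i < 10; i++)
         request_job("descent guidance / radar servicing");
     return 0;
 }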

Standby Mode

Somebody needs to correct this. It can't have been a 5 to 10 kW reduction. More like tens of watts.

http://history.nasa.gov/ap16fj/csmlc/a16Lemer1-7.gif says that some sort of standby (I don't know if it's the same) reduced consumption by 3 Watts. Does someone know if it's the same as the one mentioned in article?

Presumably it's the same. The text has been corrected already to read W not kW.--agr 14:44, 8 December 2006 (UTC)[reply]


The article says that the standby mode was never used. Is this really true? I can't believe this. What about Apollo 13? —Preceding unsigned comment added by 89.27.200.16 (talk) 12:37, 24 July 2009 (UTC)[reply]

Description section

The photo is nice but gives no indication of scale. Could anyone add indications of power usage and physical dimensions to the description section? Garrie 03:01, 12 December 2006 (UTC)[reply]

Technical & Generalize tags

I added the technical and generalize tags. This page is far too technical. This is obviously an amazingly important technology, but I don't know that describing clock cycles, circuit design, and other technical specifications is the best way to convey it. Can we get more about the historical context (the state of IC computers in the era), advancements that the computer made that can still be seen in modern computers, and how the computer affected or resolved missions in flight (e.g. Apollo 13)? Madcoverboy 16:02, 11 March 2007 (UTC)[reply]

First?

The description section says, The Apollo flight computer was the first to use integrated circuits (ICs), and then later on, The decision to use a single IC design throughout the AGC avoided problems that plagued another early IC computer design, the Minuteman II guidance computer. If there was an earlier design, how could this be the first? -- RoySmith (talk) 20:28, 26 June 2007 (UTC)[reply]

It says 'another early design', not an earlier design. They were developed at around the same time, I'm not sure which came first. Mark Grant 22:28, 26 June 2007 (UTC)[reply]
I read the sentence as meaning, "Based on the experiences of the Minuteman II computer, the AGC design team decided to go with a single IC design". Maybe the sentence just needs rewriting to avoid giving that impression. -- RoySmith (talk) 15:11, 27 June 2007 (UTC)[reply]
Sure, I'd agree it's a bit confusing. Mark Grant 15:23, 27 June 2007 (UTC)[reply]
Here's one site claiming it was the first: http://www.ieee-virtual-museum.org/collection/event.php?id=3457010&lid=1 Mark Grant 15:06, 27 June 2007 (UTC)[reply]
And comments by Henry Spencer, who usually knows his stuff with anything space-related: http://yarchive.net/space/politics/nasa_and_ICs.html
This may come down to a question of how you define 'first': the AGC was probably the first IC-based computer to go into development (MIT were ordering ICs in February 1962), but may not have been the first to fly (e.g. I haven't found a date for the first Minuteman-II launch). Mark Grant 15:23, 27 June 2007 (UTC)[reply]

Looks to me like it was the Minuteman II that was the first successful all-digital flight computer [2], the first MM2 test flight occurred in September 1964. [3] Banjodog (talk) 05:53, 27 January 2009 (UTC)[reply]

Misc

This is already in the external links section. Gwen Gale (talk) 01:17, 28 December 2007 (UTC)[reply]

Error 1202

Can you tell whether it was this Apollo Guidance Computer (AGC) that displayed an "error 1202" during the Apollo 11 landing on the Moon and forced Armstrong to land manually?

Yes. See, for example, http://history.nasa.gov/alsj/a11/a11.1201-pa.html Mark Grant (talk) 00:41, 25 July 2009 (UTC)[reply]

The MIT AGC Project link is effectively dead --

The Burndy Library has moved to the Huntington Library, Art Collections, and Botanical Gardens in San Marino, California. The Dibner Institute, formerly on the MIT campus, is now closed. Information regarding the Burndy Library and Dibner Fellowships may now be found at http://huntington.org/burndy.htm. Inquiries may be sent to publicinformation@huntington.org.

Some of the material is archived at http://authors.library.caltech.edu/5456/, but I haven't tracked down all of the items from the external links yet. Autopilot (talk) 00:13, 5 March 2008 (UTC)[reply]

Verb/noun commands

Are there any examples of how the verb/noun command inputs worked? I can see the codes in the picture of the side panel, but it's not clear what they are for or how they are used.

-Bill —Preceding unsigned comment added by 75.180.8.80 (talk) 12:10, 26 July 2009 (UTC)[reply]

See the external links for the article but page 17 of this pdf puts forth a quick overall take on the noun/verb buttons. Gwen Gale (talk) 12:55, 26 July 2009 (UTC)[reply]
Looking at the "helpful" list of verbs and nouns, now I think Robert Heinlein was closer to the mark than I thought in his story Misfit, where the navigators have to convert everything to binary - by hand! - before entering it into the computer. The DSKY must have been a brutal man-machine interface to use, by today's standards. --Wtshymanski (talk) 16:04, 2 April 2010 (UTC)[reply]
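For anyone else wondering: roughly speaking, the verb said what to do and the noun said what to do it to, keyed in as VERB digits NOUN digits ENTR; for example, V16 N36 ENTR asked the computer to continuously display the mission clock, and V35 ENTR ran the lamp test. The sketch below illustrates the dispatch idea only; the table contents and names are illustrative, not taken from the flight software:

 #include <stdio.h>

 struct vn_entry {
     int verb, noun;          /* noun 0 here stands for "verb-only" entries */
     const char *meaning;
 };

 /* A few well-known combinations; purely illustrative. */
 static const struct vn_entry table[] = {
     {  6, 36, "display the AGC clock once" },
     { 16, 36, "monitor (continuously display) the AGC clock" },
     { 35,  0, "lamp test" },
 };

 static void entr(int verb, int noun)
 {
     for (unsigned i = 0; i < sizeof table / sizeof table[0]; i++) {
         if (table[i].verb == verb && table[i].noun == noun) {
             printf("V%02d N%02d: %s\n", verb, noun, table[i].meaning);
             return;
         }
     }
     printf("V%02d N%02d: OPR ERR (illegal combination)\n", verb, noun);
 }

 int main(void)
 {
     entr(16, 36);   /* keyed as V 1 6 N 3 6 ENTR */
     entr(99, 99);   /* a nonsense combination lights the OPR ERR lamp */
     return 0;
 }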

Units - kibibyte

Disclaimer: I didn't introduce the kibi/mebi nomenclature.

Has there been prior discussion on the units used in the article? A recent edit has removed references to kibi and mebi (see kibibyte) and replaced them with the more widely used (but arguably misleading) "kilo" and "mega". The original references were in line with the (not widely used) IEC standard for referring to powers of 2 (binary). I was myself tempted to make the same change, but left it alone on the basis that there might have been good reason/prior consensus, and moreover the IEC designations are in some small way more precise.

Is there a good argument for not using the kibi/mebi nomenclature? Lissajous (talk) 08:40, 4 September 2009 (UTC)[reply]

On reading Anakin's response on my talk page (summarized below) - I'm won over to sticking with terminology that's already widely used - i.e., kilobyte etc.
It's somewhat a matter of personal taste, but I think there are good arguments for using the standard kilobyte/megabyte/gigabyte terms. There has been a lot of discussion about it on Wikipedia, and WP:MOSNUM suggests using kilo/mega/giga as they're better understood than kibi/mebi/gibi, though it does offer some exceptions. Seems to me that in every article using the IEC prefixes, they're hyperlinked to a long page explaining their meaning and history and justification. The added confusion doesn't seem at all helpful to readers, having to force new words upon them before they can continue reading the original article (though it should probably be decided on a per-article basis). • Anakin (talk) 17:57, 4 September 2009 (UTC)[reply]

Lissajous (talk) 18:37, 4 September 2009 (UTC)[reply]

Block I vs Block II

I'm in no great hurry, but at some time I'd like to move the focus of the article to be on the Block II design (which actually flew the manned missions), with the Block I taking a lesser role. At present the text describes the Block I by default, with references to changes for the block II made later. The design history is interesting, and the evolution from Block I to Block II is important, but the design of interest is in fact the Block-II version. Are there good reasons for not doing this? Lissajous (talk) 05:30, 11 October 2009 (UTC)[reply]

These links actually seem to be working now. Added 'nb.' on size of downloads. --220.101.28.25 (talk) 11:48, 31 October 2009 (UTC)[reply]

They work for me too, so I removed the 'dead link' tags. I also changed the 'section headers' for the links section to wiki headings since they're not very obvious when they're just part of the generic text; that's bugged me for a while and I think this is an improvement. Mark Grant (talk) 20:16, 31 October 2009 (UTC)[reply]

Overwriting software

"The software could be overwritten by the astronauts using the DSKY interface. As as done on Apollo 14." - this seems rather dubious to me as the software was in fixed core memory, and the only explanation I've found of AGC hacks on Apollo 14 is this:

http://www.ibiblio.org/apollo/#Solution_to_the_Apollo_14_Final_Exam

So in that case it would appear that the astronauts were changing variables in the erasable memory rather than code.

I'm sure I do remember a mission installing a software 'patch' in erasable memory but can't find any reference to it now. And that's still not really 'overwriting' the software. Mark Grant (talk) 05:17, 15 September 2010 (UTC)[reply]

Dimensions

At the top of the article it mentions that the dimensions of the AGC are 24" x 12.5", however from the pictures it looks more squarish, and I recall it being more like 8" x 8". Is there a source to confirm dimensions? Or maybe I am thinking of just the DSKY part of it? Logicman1966 (talk) 23:48, 9 February 2011 (UTC)[reply]

You're thinking of the DSKY. Here is a ref for the Block I AGC dimensions, which are a little different from the dimensions given in our article: http://www.nasm.si.edu/collections/artifact.cfm?id=A19720340000 -agr (talk) 01:10, 10 February 2011 (UTC)[reply]

In layman's terms...

> Block II had 32 kilowords of fixed memory and 4 kilowords of erasable memory.

I'd like to explain this in terms an ordinary person will understand - would it be fair to say "approximately as much memory as a Commodore 64"?

Regards, Ben Aveling 10:57, 7 October 2011 (UTC)[reply]
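As a rough back-of-the-envelope way to judge that comparison (using the word counts quoted above and treating a 15-bit AGC word as 15 data bits; neither machine actually stored data this way, it's only for scale):

 #include <stdio.h>

 int main(void)
 {
     long fixed_words    = 32 * 1024L;   /* fixed (rope, read-only) memory, words */
     long erasable_words =  4 * 1024L;   /* erasable (read/write) memory, words   */
     long bits = (fixed_words + erasable_words) * 15L;
     double kib = bits / 8.0 / 1024.0;

     printf("AGC total: ~%.1f KiB if packed into bytes (mostly read-only)\n", kib);  /* ~67.5 */
     printf("C64 total: 64 KB RAM plus 20 KB ROM\n");
     return 0;
 }

So "roughly a Commodore 64's worth of memory" is in the right ballpark, with the caveats that most of the AGC's memory was fixed rope rather than RAM and that the word sizes differ.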

Standby

This is kinda confusing. So according to what it says now, it sounds as if full operation used 70 W and standby used 65 or 60 W -- and I'm not sure that's correct. Also, can we get an inline reference there? 31.16.108.201 (talk) 23:38, 26 February 2012 (UTC)[reply]

Factual errors

In the 1960s I was an engineer who worked on the AGC at Raytheon. I noticed four minor factual errors:

1. There were actually 2 different integrated circuits in the AGC. As stated, one was the dual RTL 3-input NOR gate used for the logic (made by Fairchild). In addition, the memory used an integrated circuit sense amplifier (made by Norden).

2. The packaging of the ICs was not in flat packs, but metal TO-5 can-style packages (I forget the number of leads). I believe this was done for cooling reasons. The AGC was conduction cooled (no air in space) and the individual modules had magnesium headers with holes for the TO-5 cans, resistors, and other components.

3. Although wire wrap was used in the backplane connections (for module-to-module connections), within the (epoxy) potted modules the interconnections were welded wire. (Solder connections were viewed as too unreliable).

4. I challenge the photo of the erasable core memory. The erasable memory was not built of individual planes as pictured, but was a "folded" design which was also potted to form a module like the others, but in a "silastic"-type material since the epoxy was too rigid and the cores couldn't stand any compressive force. Interestingly, each core had 4 tiny magnet wires threading through it. The requirement of no solder connections also applied to those magnet wires, each of which needed to thread several hundred tiny cores. In the process of manufacturing the "folded" stack, sometimes cores needed to be replaced (because they were defective), which meant painstakingly removing and replacing the 4 wires involved. In commercial memories where solder splices are allowed it's easy, but for the AGC memory the 4 wires needed to be replaced and re-strung. — Preceding unsigned comment added by 72.28.170.2 (talk) 19:01, 8 July 2012 (UTC)[reply]

EDRUPT Instruction

Found this on page 95 of the book "The Apollo Guidance Computer: Architecture and Operation" by Frank O'Brien, in the section "Communicating with the outside world: the I/O system, the EDRUPT instruction".

It appears to explain how the EDRUPT instruction was used: for disabling interrupts, and apparently for implementing self-diagnostic tests.

When EDRUPT is run, all other interrupts are disabled and the ZRUPT register is loaded with the value of Z (the program counter); in the LM autopilot, the code that terminates the DAP cycle is then run, and a RESUME instruction is executed at the end to re-enable interrupts.

So that's the gist (not an exact word-for-word quote, just a summary of how the instruction is supposed to work). Not verified on the AGC emulator yet, but this appears reasonable. 27.253.192.65 (talk) 15:21, 3 April 2013 (UTC)[reply]
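A loose C-style rendering of the behaviour described above, for readers who think in code (illustrative only; the register and routine names follow the description here, and nothing is taken from the flight code itself):

 #include <stdio.h>

 static int interrupts_enabled = 1;
 static int Z;        /* program counter      */
 static int ZRUPT;    /* saved return address */

 static void resume(void)
 {
     Z = ZRUPT;                   /* return to the saved address */
     interrupts_enabled = 1;      /* re-enable interrupts        */
 }

 static void edrupt(void (*handler)(void))
 {
     interrupts_enabled = 0;      /* inhibit further interrupts  */
     ZRUPT = Z;                   /* save the program counter    */
     handler();                   /* e.g. end-of-DAP-cycle code, or a self-test hook */
 }

 static void dap_cycle_end(void)
 {
     printf("DAP cycle terminated\n");
     resume();                    /* restore Z and re-enable interrupts */
 }

 int main(void)
 {
     Z = 0x1234;                  /* pretend program-counter value */
     edrupt(dap_cycle_end);
     printf("resumed at %04X, interrupts %s\n",
            (unsigned)Z, interrupts_enabled ? "enabled" : "disabled");
     return 0;
 }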