Rapid notes on solvers: algorithms for optimising a black-box function. This is probably all in textbooks, but I don't have a textbook. I also don't know what these are formally called: optimisation algorithms, or iterative something-or-others.

- Newton's method, the Newton-Raphson method, etc. These need the derivative of the function with respect to each parameter, or an approximation to it. Since there are applets that apply them to arbitrary functions, and symbolic differentiation is not entirely trivial, they must be getting the derivatives some other way. Okay, yes: you can use finite differences to approximate the derivatives. It's obvious that you can; it's less obvious to my simple brain that it will actually work, but apparently it's kosher.
- BFGS (Broyden-Fletcher-Goldfarb-Shanno). DFP (Davidon-Fletcher-Powell). Quasi-Newton methods.
- Impressive-sounding names. Boom! Baby!
- Check out NumPy and Octave's Python interface.
- Numerical, sorry, Simple Recipes in Python.
- ScientificPython
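To convince myself the finite-difference trick actually works, here's a sketch of Newton's method minimising a one-dimensional black-box function, with both derivatives approximated by central differences. The function name `newton_minimize` and the step sizes are my own choices, not from any particular library.

```python
def newton_minimize(f, x0, h=1e-5, tol=1e-8, max_iter=100):
    """Minimise a 1-D black-box function with Newton's method,
    approximating f' and f'' by central finite differences."""
    x = x0
    for _ in range(max_iter):
        # central-difference approximations of f'(x) and f''(x)
        d1 = (f(x + h) - f(x - h)) / (2 * h)
        d2 = (f(x + h) - 2 * f(x) + f(x - h)) / h**2
        step = d1 / d2
        x -= step          # Newton update: x <- x - f'(x)/f''(x)
        if abs(step) < tol:
            break
    return x

# example: minimise (x - 3)^2 + 1; converges to roughly x = 3
print(newton_minimize(lambda x: (x - 3) ** 2 + 1, x0=0.0))
```

For a quadratic like this, the update lands on the minimum in essentially one step, since the finite-difference second derivative is (up to rounding) exact; for less friendly functions the step size `h` and a safeguard on `d2` near zero would both need more care.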
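And since the acronyms above are otherwise opaque: the idea behind BFGS is to avoid computing the Hessian at all, instead building up an approximation to its inverse from successive gradient differences. A bare-bones sketch in NumPy, with a finite-difference gradient and a crude backtracking line search; the names `fd_grad` and `bfgs` and all the tolerances are mine, and a real implementation (e.g. in SciPy) does the line search far more carefully.

```python
import numpy as np

def fd_grad(f, x, h=1e-6):
    """Central finite-difference gradient of f at x."""
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

def bfgs(f, x0, tol=1e-6, max_iter=200):
    """Minimal BFGS sketch: maintain an approximation H of the
    inverse Hessian, updated from gradient differences."""
    x = np.asarray(x0, dtype=float)
    n = len(x)
    H = np.eye(n)                 # start from the identity
    g = fd_grad(f, x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                # quasi-Newton search direction
        t = 1.0                   # crude backtracking line search
        while f(x + t * p) > f(x) and t > 1e-10:
            t *= 0.5
        s = t * p
        x_new = x + s
        g_new = fd_grad(f, x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-12:            # curvature condition: only update if safe
            rho = 1.0 / sy
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# example: quadratic bowl centred at (1, 2)
print(bfgs(lambda v: (v[0] - 1) ** 2 + 2 * (v[1] - 2) ** 2, [0.0, 0.0]))
```

The point of the update formula is that `H` stays symmetric positive definite as long as the curvature condition `sy > 0` holds, so `-H @ g` is always a descent direction, without ever touching a second derivative.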