I’m of course very flattered you’re here reading this page, but I’m not particularly comfortable writing about myself, so for the moment please feel free to look at my LinkedIn (professional) profile and my personal Twitter feed.
- Twitter (@jamesamatthews)
Greetings.
In reading your excellent articles on neural network backpropagation, I noticed an equation that confused me. Sorry if I have this wrong: is it possible that in this article (http://www.generation5.org/content/2002/bp.asp), the derivative equation should read
y’ = y(1-y)
or perhaps even
y’ = y(x) * (1-y(x))
rather than
y’ = x(1-x)
?
Here is a reference link: http://mathworld.wolfram.com/SigmoidFunction.html.
If this is my misunderstanding, my apologies for wasting your time.
Thanks again for these articles.
Yes, you’re absolutely right! 🙂 Sorry, that has been pointed out to me before but I’ve never got around to correcting it. Glad you found the Gen5 article useful.
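For anyone following along, the corrected identity is easy to verify numerically: with y = 1/(1 + e^(-x)) (the sigmoid from the MathWorld link above), the derivative is y' = y(1 - y), written in terms of the sigmoid's output rather than its input. A minimal sketch in Python (function names are my own, not from the Gen5 article):

```python
import math

def sigmoid(x):
    """Logistic sigmoid: y = 1 / (1 + e^(-x))."""
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_derivative(x):
    # Corrected form: y' = y * (1 - y), where y = sigmoid(x)
    y = sigmoid(x)
    return y * (1.0 - y)

# Sanity check: compare against a central finite difference
h = 1e-6
for x in (-2.0, 0.0, 1.5):
    numeric = (sigmoid(x + h) - sigmoid(x - h)) / (2.0 * h)
    assert abs(sigmoid_derivative(x) - numeric) < 1e-6
```

This output form of the derivative is why backpropagation code typically reuses the neuron's activation value directly instead of recomputing anything from the raw input.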
Any chance of you doing a FB page so I can follow that? I don’t use Twitter, and I like aggregators so I don’t have to visit a lot of individual sites. FB works for that when there are no RSS feeds available. Looking forward to you spending more time at generation5 when that happens again.
Mike
If you decide to put back up http://www.generation5.org/content/1999/aihistory.asp
please let me know the new URL as I have been citing it.
Jonathan Grudin (jgrudin@microsoft.com)
Hi, I’ve found a reference to an article you posted a while ago. It seems to not be available anymore (I assume an old web site?)
Any chance you have it available? The link I have is http://www.jamesmatthews.me/content/2004/noReco.asp
Hi James, I’ve been directed to read an article you had on your old site regarding homemade speech recognition (http://www.jamesmatthews.me/content/2004/noReco.asp). I was wondering if you had the article available still?
Stephen