It is early on Saturday morning and I’m buzzing. No, it’s not due to the second cup of coffee I’m enjoying right now. I awoke around 3 a.m. and tried without success to go back to sleep. Resigned to staying awake for a while, I grabbed the TV remote and began flipping through programs. There were a number of news talk shows airing the latest flap in the Democratic Presidential race, and I finally found an episode of The Brady Bunch to dull my senses, hoping it would relax me back to sleep. However, when that show ended, it was 4 a.m. and I began surfing channels again.
I then stumbled upon a program that instantly energized me. When I heard the host say that he had been trying to get his next guest on for the last ten years, I became curious. But when he announced his guest’s name, I knew I was going to be up for a while and was beyond going back to sleep. I got up and put on the coffee so that I could take it all in.
I’ll bet that my ears visibly jumped. I was immediately pumped up when I heard that Ray Kurzweil was the guest on the Glenn Beck show on CNN Headline News. This was a repeat of the show’s original broadcast from the night before, Friday, May 30, 2008.
I don’t know if CNN airs entire one-hour shows such as Beck’s online, but if they don’t, they should make an exception for this one. It is a riveting exchange with this fascinating man. However, if you missed the program and cannot see it in any re-airing, you can read the show’s transcript here.
If you are not familiar with his name, Kurzweil is an author, inventor, futurist, and man of many other appropriate titles, but most of all, I believe he’s one of the smartest men of our time. His name is synonymous with assistive technology. One of the first pieces of AT that I purchased was a Kurzweil Reading Edge, a stand-alone optical character recognition (OCR) scanner that sold for more than $5,000 in late 1994. I remember that being the year of the purchase because I made it shortly after leaving the Criss Cole Rehabilitation Center, which is where I first learned about that innovative machine.
Kurzweil is in high demand as a speaker at numerous conferences and events whose subjects range from music to science. While the conferences might vary, his focus always centers on information technology. If you didn’t catch it in my recent post, he was one of the two noted keynote speakers at this year’s CSUN.
I mentioned that I had purchased my Reading Edge in 1994. That might have sounded like cutting-edge technology for that year, but the underlying invention was already nearly two decades old at that time. Kurzweil invented the first print-to-speech reading machine for the blind, a device the size of a washing machine, and sold the first model in 1976. Watch the interview or read the transcript to hear how that whole idea evolved. I think you’ll find it interesting.
(Extra credit if you can guess the name of the first customer of this product. Read to the end of this post for the answer.)
Later in the Beck show, Kurzweil demonstrated the latest incarnation of his reading machine, the KNFB Reader. He scans a document and lets the product do its thing, reading on the show for the entire world to see and hear. Kurzweil promotes the KNFB Reader for its dual purpose of assisting people who are blind or visually impaired and people who are dyslexic. To emphasize this aspect, he shows the phone’s screen to demonstrate how the words being read from the scanned document are highlighted, and how the highlights move as the speech progresses.
Think about what that says. From a behemoth, washing-machine-sized model to a four-ounce cell phone that can be held in the palm of your hand, in just over 30 years…Wow!
To understand the evolution of that OCR system is to understand most of what Kurzweil discusses. Exponential growth is key to understanding how things evolve in our world. Kurzweil is a highly intelligent man, but in this interview, he doesn’t come across as an overly intellectual person speaking over the layperson’s head. With Beck providing the questions, Kurzweil explains his understanding of life and technology in very plain terms.
Don’t even get the idea that Kurzweil is one-dimensional. Kurzweil speaks of many things during this program and assistive technology is just one of them. I mentioned that he is a futurist. Don’t confuse that with some flaky person who says they are a fortuneteller. Futurists examine history, technology, and trends on several planes and make predictions from the understanding gleaned from that examination. Kurzweil’s been doing this for thirty years and has been pretty accurate on many fronts.
Here are a few points from this interview I find totally intriguing.
* Exponential growth is based on numbers doubling and the time needed to do this.
* The energy produced by the sun is more than 10,000 times the power needed to support our world today.
* Solar power technology is doubling every two years.
* We are only five years from solar power being more affordable than current coal or gas.
* We are only seven doublings away from solar energy being the dominant power supply for our earth. (A quick sketch of that doubling arithmetic follows below.)
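Since those last three points all turn on the same doubling math, here is a quick back-of-the-envelope sketch, in Python, of what seven doublings at a two-year doubling period works out to. This is my own illustration of the arithmetic, not something presented in the interview.

    doubling_period_years = 2   # solar capacity roughly doubles every two years
    doublings_to_dominance = 7  # Kurzweil's figure from the interview

    growth_factor = 2 ** doublings_to_dominance                    # 2^7 = 128 times today's capacity
    years_needed = doublings_to_dominance * doubling_period_years  # about 14 years

    print(f"Seven doublings is a {growth_factor}x increase in roughly {years_needed} years.")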
If you can find it, do check out the interview. You’ll discover a good discussion of health and the integration of technology with our bodies, as well as a deeper look at fossil fuels, computers, information storage, and global economics; it even briefly touches on the applications of information technology to terrorism. You’ll also learn a little bit about Kurzweil himself.
And, here’s your extra credit answer: Kurzweil sold his first reading machine to Stevie Wonder. The two forged a friendship from this that led to further collaboration. The offspring of that work was the Kurzweil 250, widely regarded as the first electronic keyboard to convincingly reproduce the sound of a grand piano.
By the way, my coffee’s now all gone and I’m still buzzing.
Saturday, May 31, 2008
Thursday, May 29, 2008
Assistive Technology Blog Carnival is open for business
Greetings, fans of assistive technology.
I’m here today to let you know that the latest edition of the Assistive Technology Blog Carnival is up for public consumption. The best news is that you don’t need any tickets to enter the carnival. Better yet, there is no inaccessible CAPTCHA keeping posters from providing information.
This edition of the carnival has entries discussing Natural Reader’s floating toolbar, YAKiToMe, speech synthesis on Mac OS X, and my own entry about transitioning between speech synthesizers.
Submissions to the carnival this month were fewer than last month, but I think Lon might have it right that some found it too limiting to post on the theme of synthesized speech. Next month, though, will be an open forum, so submit your favorite assistive technology entry to Lon at
lonthornburg@nolimits2learning.com
for the June carnival.
I hope you can join us at the carnival.
Tuesday, May 27, 2008
Updated: Xerox includes accessibility on copiers via USB port
If you’ve read any of my previous posts concerning the iPhone, then you probably understand that I believe that this device has helped touchscreens to grow in popularity and this type of input method will, undoubtedly, continue to proliferate in the future technology and electronics landscape. But, that popularity is something that has to be taken in stride, especially when accessibility is not a consideration, such as with the iPhone.
However, touchscreens and their visual display predecessors are not new for operating office machines. If you’ve ever worked around any of the high-end office copiers out there, you can understand just how inaccessible they can be. Even when I still had sight, nearly 15 years ago, copy machines had visual displays that could only be read by standing over the machine and looking down at them. Operating a machine with this type of display would pose problems for people with myriad visual disabilities, as well as anybody who uses a wheelchair.
Peter Abrahams has written a great article, with a good discussion of relevant issues, on how Xerox has implemented USB accessibility on an assortment of their copier products.
The key to accessibility lies in the Xerox Copier Assistant™, software that runs on a compatible Windows machine and connects to the Xerox machine via a built-in USB port. The software provides accessibility for people with various disabilities, and Abrahams gives a good overview of those as well. Input is allowed via mouse, keyboard, or, with the proper combination, voice command.
This article offers a good examination of touchscreens and lists how they provide a superior user experience. Additionally, Abrahams lists how they make good business sense, but balances this all out with how they have been inaccessible in the past. He wraps up by looking at the under-promotion Xerox is giving this landmark achievement and offers some tips to help make the most of what they have done.
Hats off to Xerox. You rock.
(Thanks to T. Reid at Reid My Mind for the initial information about this news.)
Update: 05/28/08
Thanks to an anonymous commenter, I am reminded that, back in March and July of 2006, the American Foundation for the Blind’s great bimonthly magazine Access World discussed this problem in two articles and offered an accessibility review of some conventional brands of office copiers. While these articles are two years old, I believe the information is still relevant and appropriate to share in this post, so here they are:
March 2006
July 2006
I personally recall reading those articles and don't know why it didn't dawn on me before to include them in this post originally. I appreciate the useful feedback.
Friday, May 23, 2008
Congressional opportunities for people with disabilities
I’ve got some worthwhile news to share from the Day in Washington blog. The content that follows is from Day’s blog. I’m providing it here in its original form to spread the word.
RG
Recently, I attended a meeting with Dena Morris, Legislative Director in Senator Durbin’s Office. She wanted to emphasize the importance and need for people with disabilities in Congress. Below is a note from her. I strongly encourage anyone with a disability and an interest in policy to seriously consider her offer. It is a wonderful opportunity and would have a significant impact on increasing the diversity in congressional staff.
From Dena Morris:
Senators Durbin and Harkin are eager to help us change the face of the U.S. Senate, with a staff that better reflects the diversity of the country. Historically, we haven’t done a very good job of finding, hiring and growing people with disabilities in legislative jobs. We’d love your help in turning that around.
We are looking for talented young people, hoping to get started on the Hill, who are willing to learn in an entry level position and grow with the job. Mid-career hires are less frequent, but we’re looking for talent and don’t want to discourage anyone.
In general, we look for people who are smart, eager and willing to work hard. Political sensibilities are important, along with discretion, judgment and a level head. Communication is critically important, so an ideal applicant will write clearly and be confident and articulate in speaking. As I mentioned last week, you need to be able to juggle several tasks at the same time.
Please feel free to send interested candidates my way and I’ll do my best to either talk with them or help direct them to an office that might be a good fit for them.
Thanks again for your help.
Dena
If you feel you are a good candidate for positions on the Hill, or know someone who is, please contact:
Dena Morris
Legislative Director
Sen. Durbin’s Office
202-224-8466 (phone)
dena_morris@durbin.senate.gov
Labels:
Congress,
Employment,
News,
Senator Durbin,
Washington DC
ACB Radio's Main Menu to host panel on CAPTCHA, alternatives, and accessibility
I, for one, am very glad to hear about the following information and am interested to see what is coming down on the accessibility front.
Anybody interested in internet visual verification systems, such as CAPTCHA, and alternative solutions, like the accessible ReCAPTCHA, will want to catch the upcoming Main Menu program on ACB Radio.
From the Blind Access Journal post linked above:
“We are proud to introduce our panel of experts and their primary areas of focus: “
“* Matt May from the Adobe Systems Accessibility Team will discuss his 2005 W3C note on the inaccessibility of CAPTCHA.”
“* Luis von Ahn from ReCAPTCHA at Carnegie Mellon University will describe their accessible solution.”
“* Steve Dispensa from PhoneFactor will tell us all about an innovative, telephone based two-factor authentication system. “
“Main Menu can be heard on Tuesday evenings at 9:00 Eastern, 6:00 Pacific, and at 1 universal (GMT) on Wednesday mornings on the ACB Radio Main Stream channel. “
To listen to the show, the link is:
http://www.acbradio.org/pweb/index.php?module=pagemaster&PAGE_user_op=view_page&PAGE_id=8
Props go out to Jeff Bishop and Darrell Shandrow at Main Menu for putting together such a well-qualified panel of guests to speak on this important matter.
And, I have one more note on this subject:
Hello, Yahoo mail! Hello, BlogCarnival.com! Do you know about this? More than that, do the clowns at the BlogCarnival even care?
Thursday, May 22, 2008
Initial impressions of the pilot site for NLS digital talking book downloads
In case you missed it, a little while back the excessive cheering I’d noticed in the blind community got to be too much, and I had to find out what this hot, new, accessible media player was all about. I treated myself to a Victor Reader Stream so that I could see for myself if this product was worth all the hoopla.
But, this post isn’t about my affection for this powerful piece of assistive technology. What I want to share here today is news about the
digital talking book pilot program
of the
National Library Service for the Blind and Physically Handicapped.
After all, being able to play these books on the Stream was the final, convincing argument that prompted me to get mine.
This library service, or NLS, as it is commonly called, makes books available in an accessible, modified format to approved patrons who have a verified disability that keeps them from being able to read regular print books. The NLS is a division of the United States Library of Congress and is supported with federal funding.
For background, more than five years ago I quit ordering books from the state talking book library, which includes books from the NLS. The main reason was that I was discouraged by the slow speed with which they were adopting digital media. I was tired of fumbling around with those 4-track cassettes and keeping the tapes in sequence for each book I read while I saw digital audio springing up all over the web. It didn’t make sense to me that the patrons of the NLS library still had to fool with these tapes when the rest of the world was quickly making the switch to digital media. Another reason was that when traveling for an extended time, I would often pack along three or four books just to ensure I didn’t run out of reading material. I felt it was excessively cumbersome to pack all those plastic boxes of tapes in addition to the heavy HandiCassette player I had for playback. It just seemed that the digital age was leaving accessible media for the blind far behind.
Instead of the tapes, I chose to listen to books on CD. It wasn't that the CDs were any less of a hassle to tote and keep organized, but I would rip the CDs into mp3 format and listen to the books on my computer. Being that I use a home computer and not a laptop, though, this meant I was only able to listen to the books while on the computer. This was a little restrictive in that sense, but I liked it better than the 4-track tapes. It still didn't give me the portability I longed for, though. And, this is where I jump to the present, and also, to the Stream.
The Stream is one of a small group of authorized, accessible media players that will play back books from the NLS, which was one of the reasons I made the leap and purchased it. (For the other authorized players, read the comment left on this post by Wayne.) I had checked out the web site for the pilot digital talking book program and read that there were more than 10,000 titles already converted to digital format and available for download. That was enough to sway my decision and convince me to make the Stream purchase.
However, because the books will only play on one of these devices, and the NLS doesn’t issue them, only people who have purchased one of these accessible media players can use this digital book program. That is true, at least for now. When the NLS goes fully digital and this is no longer a pilot program, the NLS will have to provide some form of digital book player to consumers for playing the protected audio files in order to ensure access to all. But for now, they have this ever-growing collection of titles already in a digital format, and there is an authorization process to validate the players, so these digital talking book files are available today to anybody with one of these players.
The authorization key is emailed to the owner and is specific to that one, unique player. This authorization process requires the NLS to coordinate communication by email with the manufacturer, as well as the user. Once authorized, that player can play any of the digital audio files from the NLS site, which includes magazines. Users are limited to 30 downloads in any 30-day period.
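To make the download policy concrete, here is a minimal sketch in Python of the 30-downloads-in-any-30-day-period rule described above. This is purely my own illustration of the policy; it is not NLS code, and the function and variable names are hypothetical.

    from datetime import datetime, timedelta

    def can_download(download_times, now=None, limit=30, window_days=30):
        # Return True if another download is allowed under the rolling limit.
        now = now or datetime.now()
        window_start = now - timedelta(days=window_days)
        recent = [t for t in download_times if t >= window_start]
        return len(recent) < limit

    # Example: a patron with 29 downloads over the past month may take one more.
    history = [datetime.now() - timedelta(days=d) for d in range(29)]
    print(can_download(history))  # True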
Initially, I must say I’m very impressed with the NLS pilot site. It is searchable by author, title, subject, or NLS catalog number. The entire process has run seamlessly for me and authorizing the Stream to play these files was a simple procedure.
The first book I downloaded from the NLS site was one I have written about here previously, The Short Bus: A Journey Beyond Normal by Jonathan Mooney.
(I’ll write more here later with my review of that book.)
Playback on the Stream was a breeze, even the time when I didn’t lock the keys and accidentally bumped one that stopped the book while I was more than halfway through the ten-hour work. When I pressed play again, it took me back to the beginning of the book, but within seconds I was able to get right back to the spot where I had left off. The NLS books are designed to allow users to skip by sections or chapters, and that is how I was able to move back to where I had been so easily.
There are some aspects of the digital book program that will seem comfortably familiar to anybody who has previously spent time listening to NLS books on tape. They will quickly recognize the familiar names and voices of narrators from previous talking books. The NLS catalog numbers of the digital books are also numerically identical to those of their recorded cassette brethren, except that they are preceded by the designation DB instead of RC. That makes sense…these are digital books and the taped versions are recorded cassettes. Keeping the numbers the same only simplifies the process.
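As a tiny illustration of that naming convention, and assuming a made-up catalog number for the example, a cassette title listed as RC 54321 would appear in the digital collection as DB 54321. A one-line helper shows the mapping:

    def digital_catalog_number(cassette_number):
        # e.g. "RC 54321" -> "DB 54321" (illustrative number, not a real title)
        return cassette_number.replace("RC", "DB", 1)

    print(digital_catalog_number("RC 54321"))  # DB 54321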
I’ve got to hand it to the folks at the NLS; they’ve done something really good here. My former frustration with the outdated, analog 4-track cassettes is today supplanted with an immense joy generated by their digital talking book program. I am once again throwing myself into reading for pleasure.
Access World offers good recap of CSUN 2008
Each year, the Technology and Persons with Disabilities conference is held in California. Perhaps you know this international gathering better as CSUN, its unofficial name, which is derived from its original host site, California State University, Northridge.
The American Foundation for the Blind’s online magazine Access World has a good recap of this year’s edition of the event.
This ever-growing event has blossomed from an initial crowd of 200 to more than 4,500 participants in 23 years. The escalating number of attendees tends to tax the creativity of its organizers to find means of hosting the showcase while still accommodating the burgeoning crowds that come to see and hear from professionals in the fields related to assistive technology.
It is one of those events that is of high interest to me personally. Perhaps, next year, I might attend and be able to give some direct feedback. A man can dream, can’t he?
Labels:
Access World,
Assistive Technology,
Conferences,
CSUN
Tuesday, May 20, 2008
Updated: MathTalk program integrates assistive technologies to make higher level mathematics accessible
I just learned about the following software that definitely has its place as a tool for students with disabilities.
On its home page, MathTalk proudly proclaims “Do math without keyboard or mouse.”
MathTalk operates with either the Dragon NaturallySpeaking or the Microsoft Speech voice input systems, allowing the user to write math calculations by voice. And we’re not just talking basic, four-function arithmetic here. MathTalk allows users to correctly write and work through pre-algebra, algebra, calculus, trigonometry, graphing, and statistics problems.
One feature I find particularly interesting is that there is a specific math-to-Braille program, employing the Duxbury Braille translator. Using this, students could do their work in MathTalk and export it into a Braille file so that it can be loaded into a Braille notetaker. That’s definitely some cool technology integration.
There are several videos on the MathTalk web site demonstrating the program in action. Take a little time and check these out for yourself. Most of the videos are short and won’t take long to watch. They will give you an idea of what the program does and also what it doesn’t do.
I checked out several of the videos and was generally impressed with the MathTalk program. It would take some work for those who are unfamiliar with either of the voice input programs to get the voice files set up and running correctly, but once that was done, MathTalk appears to have a definite niche as an assistive technology.
As one disability does not preclude a person from having another, it is easily possible that somebody who already uses Dragon NaturallySpeaking could also have a learning disability that would make this program a great fit for them. The same goes for somebody who is blind and a Braille user. For purely mental processing reasons, the ability to take in the Braille display of the problem and your work would be priceless, I would think. Just as well, being blind doesn’t exclude a person from having an LD, which could again make verbalizing one’s work a very realistic and accessible option.
From watching the last video link on the demo page, which is for MathTalk for Visually Impaired, I don’t believe that the program was working with an additional screen reader. It sounds like the developers have an integrated screen reader that works within the program.
The MathTalk for Visually Impaired program is still under development, so it is difficult to draw concrete conclusions, but it apparently does not require visually impaired users to be Braille users. This is good, as it is widely reported that only about ten percent of blind people use Braille. Still, the ability to mentally process, verbally state, and then hear your work read back to you is a definite plus when working complex equations. Also, the ability to work through problems using correct mathematical language is always a benefit.
There are only a few downsides of the MathTalk program that I can see. First, for someone with a speech disability, I don’t see where this would be the best alternative path, for obvious reasons. Additionally, the user must be sufficiently cognitively sophisticated and able to manage the proper diction with a clear speaking voice. And, blind users will need to get an interface like J-Say to allow their screen reader to work with Dragon NaturallySpeaking first, just to get their voice files working properly before even working with MathTalk. Any of these problems could arise when working with students with disabilities and render this an unworkable solution. However, for many others, it should be a very realistic solution.
Thanks to Lon Thornburg’s
No Limits to Learning blog
for this information. And, I hope this post answers some of your questions, Lon.
* * * * * * * * * *
Update 05/21/08
The additional information is basically the text of my comment replying to Lon's initial comment. After thinking about it, I feel there are some worthwhile points in the additional thoughts I had. RG
Speaking as a man who was totally blind when taking classes in macroeconomics, algebra, and two semesters of statistics as an undergrad, as well as another year of graduate statistics, I personally know how important it is to have correct phrasing of algebraic and statistical expressions. For that reason, when taking these classes, I requested as an accommodation that somebody knowledgeable in the language proctor my exams.
When using MathTalk to work through problems, that correct expression is one aspect of the program that I see as a strength, but one I don’t believe they tout strongly enough as a feature on the MathTalk web site.
Since I am not a Braille user, I used a Type ‘n Speak to take notes in class. I had my professors read aloud the problems they were writing on the board so that I could write them down. And, because I needed to be able to understand the correct phrasing when later reviewing my notes, I would write these problems out in longhand. One example might be: 325 plus (x) squared, all over (T minus 1). If you’ve ever taken statistics, you know that this is just one part of some of the problems you need to solve, and that a complete solution would require an extensive amount of typing of text along with the correct numbers when solved.
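For readers who prefer to see that spoken phrase written out, it corresponds to the expression (325 + x^2) / (T - 1), or in LaTeX notation, \frac{325 + x^{2}}{T - 1}: the quantity 325 plus x squared, divided by the quantity T minus 1. (That is simply my rendering of the example phrase above, not a complete statistics formula.)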
With all that said, I again emphasize my point about this program having a lot of value for the blind population that doesn’t use Braille. It gives feedback on your work in a form that is correctly enunciated.
However, that has to be traded off against the need to learn a voice input program in order to use MathTalk, and, for blind users, the need to implement an integration tool like J-Say on top of that. It’s a trade-off, for sure, but I firmly believe that MathTalk presents another option for some people, and it is a better solution than anything else currently available, at least that I’m aware of.
Court says U.S. Treasury Department discriminates against blind people
I think change is in the air.
And, I’m not talking about loose change. It’s bigger than that. The change of which I speak concerns our paper money here in the United States. You see, an appeals court has just ruled that our current system of paper money discriminates against blind people.
Whether or not you agree with the lawsuit brought by the American Council of the Blind (ACB), the legal arguments made in the suit requesting change are valid.
Of all the major developed countries in the world, only the U.S. has nothing in place to make tactile discrimination between different denominations of paper money possible. So, the Treasury Department cannot argue that it isn’t possible. Nor, given the constant reworkings almost every bill has received from this department in recent years, can they argue undue hardship. And the court’s rejection of the Treasury Department’s defense that the current system is acceptable because most blind people have adapted and found ways to cope is just as strong.
From the Associated Press article:
“The court ruled 2-1 that such adaptations were insufficient. The government might as well argue that, since handicapped people can crawl on all fours or ask for help from strangers, there's no need to make buildings wheelchair accessible, the court said.”
I may be wrong, but I expect that the Treasury Department will appeal the ruling to the Supreme Court. Even if they choose to accept the current ruling as the right thing to do and quit appealing, change will be slow in coming. Still, I do believe change is in the air.
Labels:
American Council of the Blind,
Blindness,
Legal,
Money,
News
Friday, May 16, 2008
Glove converts sign language into speech
I read a tech article in Barron’s earlier this week in which some folks, recognized for their ability to see current trends and how they might evolve to meet future needs, made a list of their
Top 10 tech trends for the future.
In that list, there seems to be a general acknowledgement that the cell phone will continue to evolve into something much more complex and useful than its current incarnation. However, one possible application of the cell phone that wasn’t mentioned was its potential to act as the translator for a glove that converts sign language into sound.
The glove will give somebody who communicates using sign language the ability to communicate with almost anybody, and to do so without the need for an interpreter. When it is completely developed, somebody who uses sign language will be able to wear this glove and sign as usual. The programmed glove will recognize the finger and hand positions and send the associated word or phrase to the user’s cell phone. The cell phone, running an off-the-shelf text-to-speech program, will then speak the word or phrase aloud.
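To make that pipeline a little more concrete, here is a minimal sketch in Python of the flow described above: classify a hand shape, look up the associated word, and hand the text to the phone’s text-to-speech step. This is purely my own illustration; the tiny vocabulary and the function names are hypothetical, not the CMU team’s code.

    SIGN_VOCABULARY = {
        "flat_hand_forward": "hello",
        "fist_thumb_up": "yes",
        "fist_thumb_down": "no",
    }

    def glove_to_text(hand_shape):
        # Map a recognized hand shape to a word, or None if the glove hasn't learned it yet.
        return SIGN_VOCABULARY.get(hand_shape)

    def speak_on_phone(text):
        # Stand-in for the off-the-shelf text-to-speech program on the cell phone.
        print(f"[phone TTS] {text}")

    word = glove_to_text("flat_hand_forward")
    if word:
        speak_on_phone(word)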
The glove, a product of a team of graduate students at Carnegie Mellon University, is still undergoing refinement and is working its way up the learning curve; it knows only 32 words so far. Additionally, it seems the team has taken some liberties and created some signs of its own to work through the process, instead of using strictly ASL. However, of the 26 letters of the American Sign Language alphabet, the glove has learned 15.
Granted, it is not perfect, nor is it a completed, working model yet, but that latter outcome certainly seems like a given.
I believe the students’ hearts are definitely in the right place, integrating existing text messaging and text-to-speech applications with their vision. I hope they don’t catch any flak from deaf/hearing impaired purists for inventing a basic, primitive sign set instead of striving to make it work with existing ASL. If critics do arise, I hope they aren’t too harsh, as innovation is where great projects begin.
This is just another embodiment of people thinking outside the conventional limitations to imagine what is possible, instead of resigning themselves to what is not.
Labels:
Assistive Technology,
Cell phones,
Deafness,
Mobile phones,
News,
Text To Speech,
Trends
Tuesday, May 13, 2008
Changing colas and switching relationships: Reflections on synthesized speech
I’ve written previously that I’ve been using JAWS for more than 10 years. In those years, I’ve also grown in the amount of time I spend on the computer, going from 2-3 hours at the computer each day to probably 8 or more hours a day. Today, I would like to offer this experience as a qualifier, to validate what follows as the perspective of a seasoned veteran of assistive technology and synthesized speech. Here goes…
Although I haven’t used Job Access With Speech, better known in assistive tech (AT) circles as JAWS, since its inception, I have been around for quite a bit of transition in the life cycle of this powerful tool. When I began using JAWS, it came on four floppy disks (yes, those 1.4 MB storage devices that wouldn’t even hold one song in mp3 format), and one of those disks was used only for installing and removing the authorization key. It was also still being made by the company known as Henter-Joyce. I marveled when, I believe it was, JAWS 3.0, or maybe 3.5, arrived on a neatly packaged CD-ROM. I just thought this was AT taking a big step forward and catching up with the rest of the modern world.
I also recall that this first distribution of JAWS on CD was my first introduction to the Eloquence speech engine. I have grown to use this as my default speech synthesizer and prefer it over all others that I’ve tried. I think it offers the best sound clarity and understandable speech. I do understand that not everybody feels this way, but this is my opinion.
However, before the switch to Eloquence, I had been using a DECTalk Express, an external device that had its own volume dial and needed to be plugged into one of the computer’s COM ports. I also remember feeling at that time that a shift in reliance from a hardware-based speech synthesizer to one that was just another piece of software was something near blasphemy.
The DECTalk Express unit was small and sat nicely atop the rear half of the top of my CPU on the folding table turned computer desk, which was the Wal-mart version of the kinds used universally in school cafeterias. I had selected the Express model because it could be unplugged from my pc and plugged into any other computer that I needed to use. That portability feature was one of the selling points on choosing DECTalk over the other synthesizers I had been shown.
During my tech evaluation at the Assistive Technology Unit in Austin, I was shown JAWS and Window-Eyes for screen reader choices and, obviously for what I wanted to do on the computer, I felt JAWS would be the best option. I was then shown three or four speech synthesizers. By far, I felt that the DECTalk voices were the best. I was very sold on this selection, and I actually felt a little snobbish glee inside when, after I said I liked DECTalk best, the evaluator implied that it was the Cadillac of synths.
I used the DEC Express, as I came to call the talking box, for a couple of years without problems. I would adjust the volume up and down depending on the noise around me. That took place several times a day. Even though the new company Freedom Scientific assured me that I didn’t need to, I still chose to use it after JAWS introduced Eloquence. Sure, I installed Eloquence and tried it out for pure curiosity reasons. But that new technology sounded plastic and just didn’t have the warmth and realism that the DEC Express, with its nice collection of voices, gave me.
Then one day, I noticed the DEC Express began to emit a loud, scratchy noise whenever I turned the volume knob. That was bad, but not quite a killer for me. I just had to either get used to it or leave the volume at one level. Over the period of a couple of months, this degenerated to an untenable situation and I opted to try that new-fangled JAWS’ software synthesizer.
“Okay,” I thought, “this isn’t too bad once you get used to it.” But I still preferred the soothing tones of the DEC Express.
The final blow came when I got a new computer and the disk with the drivers for the DEC Express was hopelessly lost. With no other alternative, I made Eloquence my default speech engine on that new machine. Periodically, I would get a longing for my old friend the DEC Express and fiddle around with it, trying to make it work. I even tried to find some online resources from the manufacturer, Digital Equipment Corporation, but the company had been sold, and online support was difficult to find in the late ’90s. All of this effort was to no avail, though, and I mostly abandoned my old friend, completely trading his companionship for that new kid on the block.
I kept that neat little box atop the PC for a long time, thinking I would still find something that could fix it, but eventually began storing it on one of my bookshelves with other assorted, still-viable tech equipment. Down the line, after I got married, we moved, and it went from the bookshelf to a box that never got unpacked. From there, I think it went out with some other discarded material I no longer needed or had room to store.
I continued to use Eloquence over the years while JAWS advanced to version 9. Along the way, I’ve watched the screen reader and assistive technology fields for news and breakthroughs, mostly forgetting about my old friend the DEC Express.
Then, last year, I tried out Serotek’s System Access To Go, which was good, but the speech didn’t work well for me. I was so used to the voices in Eloquence that the SAToGo speech engine didn’t even sound right; it sounded too mechanical. I looked at the overview of SAToGo to find out what kind of space-alien voice box it was using for a speech engine.
What’s that? Did I read that correctly? SAToGo uses DECTalk. And, yes, I am the one who said it sounded too mechanical. Was I now the one blathering blasphemy?
Since finding that out, I felt like somebody who had been a die-hard drinker of one kind of cola, and when it became unavailable, switched to the other one and stuck with it. Now, years later, I’ve become accustomed to that other cola and think the first one just doesn’t taste right.
I know that the feelings I had for the DEC Express were warm, genuine, and right for where I was at the time. Since letting go and moving on to Eloquence, I’ve grown very fond of it and definitely prefer it over any other synth I’ve heard. It’s sort of like a relationship. You get a lot out of it because you rely on that other part of you for so much, but when looking back at it from the safety and security of the new relationship, that first one just isn’t as magical as it once was.
Make what you want from this examination of my transition from DECTalk to Eloquence. It is still a little puzzling to me how my loyalty has shifted so much to land where it is today, but it is what it is.
Thursday, May 08, 2008
Transitioning from high school to college; the LAMP model
I want to thank Tanesha Antoine, the Special Populations Coordinator at
San Jacinto College (South Campus)
for inviting me to meet yesterday with a group of
Clear Creek ISD
high school students who are blind and visually impaired, and are considering college after graduation.
So the group could understand what qualifies me to speak with them, I began by giving them some background on myself. I described my personal history before blindness, the adventitious blindness, and my decision to go to college as a first step in doing something meaningful.
I emphasized that I had more than 13 years of great working experience that nobody could ever take away from me, but that this was not going to be enough for me to be seriously looked at for employment. I needed something to go with this experience, something else that, once earned, could not be taken away. That was an education.
I highlighted that my first contact in college was with the DSO at the community college near my parents’ home, where I lived after the accident. I also highlighted how soon it became clear just how important that office was going to be in my educational experience.
I continued by talking about my educational journey, which took me on to the upper-level university path and had me picking up two necessary classes right there at San Jac South. I filled in details about my three years as a Graduate Assistant in the university DSO and how that taught me so much more than just the student experience of disability services, an education augmented by the time I spent as a DSO Coordinator. I let them know I knew the experience from both sides of the disability services window: both as a student receiving services and as a coordinator providing them.
I strongly emphasized the importance of understanding the difference between receiving services at the high school level and receiving them in college, outlining the laws governing these two domains. I really wanted them to understand how the onus was going to be on them to secure accommodations in college.
I’m one who finds acronyms a simple method for remembering broader concepts, and they make information easy to pass along when speaking as well. After writing down the basics of what I wanted to discuss, I looked at the central idea of each point and created the LAMP model. Below are the basics of that model.
Limitations - Understand which limitations you truly have and do not set false ones for yourself.
If you cannot see to drive, then it is a true limitation that you cannot drive. However, do not use this as a cop-out and say, “I can’t drive, so I can’t get somewhere.” I illustrated this point by explaining the one-hour drive I had made that morning to meet with them. As another example, I pointed out that just because you’re unable to see doesn’t mean you can’t do algebra.
Advocacy - Self-advocacy is one of your greatest tools.
Nobody can speak up for you better than you about what resources you have at your disposal and what your accommodation needs are. Resources are not only the adaptive or assistive technology you have, but are also your skills, such as Braille or computer access with assistive technology, as well as your network of contacts. You know what works best for you and it is up to you to communicate these to your DSO Coordinator, professors, and classmates.
Meeting - Meet with your professors as soon as possible to discuss your accommodation needs.
This is critical for both the student and the professor. Meet with them in person, over the phone, or by email, but make it a point to meet with your professors as soon as possible. Do this before the semester starts, if at all possible. If you get that nasty old professor “Staff,” or otherwise do not know who your professor will be before the first class day, then by all means, stay after class that day and meet with him/her. Most professors are in this profession because they want to teach, but they don’t always know what you need in order to learn. If you make the effort and show them what you need, then most often you will develop a good working relationship with the professor. I emphasized that there will be some who might resist specific accommodations, like recording lectures, but stand firm and call on the DSO Coordinator as your facilitator.
Planning - Planning to be successful means you must be successful at planning.
Planning has to do with everything from O&M to books, to how you’re going to address things like notetaking, projects, and any specifics of the class. You will need to learn the routes to class before the semester starts so that you can be at class on time starting that very first week. Communicate with your professors before the semester starts to learn which books are required so that you can, in turn, coordinate your needs with the DSO Coordinator so that accessible formats of the books can be gathered.
There was a good Q&A session following my presentation where a few final items were discussed. These subjects varied, but included the importance of registering with the DSO, advocating for your technology needs with the Texas Department of Assistive and Rehabilitative Services, owning your assistive technology versus using loaners provided by the DSO, and some aspects of the Criss Cole Rehabilitation Center, located in Austin. During this time, I also got to beam when demonstrating my Victor Reader Stream and showing some of the great features of this powerful piece of assistive technology.
Tuesday, May 06, 2008
BlogCarnival.com says they don't allow blind people and they don't care if that's offensive
I’m putting on my advocacy hat today. I’m steamed, so that cap might be riding a little cockeyed right now.
Any regular reader knows that a royal thorn in my side is inaccessible CAPTCHA. That is initially what this post is about, but moreover, it is about web sites that turn their noses up at providing accessibility.
I recently posted about the
Assistive Technology Blog Carnival,
which at first used a widget to allow user submissions via a site called
Blog Carnival.
(That site hereafter is referred to as BC).
However, it was soon discovered that to submit a post via the BC site, users were required to complete a CAPTCHA with no accessibility feature. Lon, the host of the Assistive Technology Carnival, promptly removed that widget as soon as he learned about the inaccessibility of the BC site. Users can still submit to the carnival through comments on the AT Carnival site linked above, or via an email to Lon, whose email address is posted on the page.
Lon, a few others, and I have taken the issue up with the BC site and written letters asking them to consider using accessible CAPTCHA technology that would allow blind users to access the site. These letters have included suggestions of accessible CAPTCHA solutions such as
RECAPTCHA.
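For readers wondering what “inaccessible” means here in concrete terms: BC’s submission form presents only a visual puzzle, with no audio challenge and no text alternative a screen reader could announce. Below is a minimal, hypothetical audit sketch in Python (standard library only; the sample markup is invented for illustration, not BC’s actual form) that flags a page whose CAPTCHA image offers no non-visual fallback. Solutions like reCAPTCHA pass this kind of check because they bundle an audio challenge alongside the image.

```python
# Hypothetical sketch: flag a page whose CAPTCHA appears to be image-only,
# i.e. it offers no audio challenge and no alt text a screen reader could announce.
# Standard library only; the sample markup below is invented for illustration.
from html.parser import HTMLParser


class CaptchaAudit(HTMLParser):
    """Collects evidence of a CAPTCHA image and of any non-visual alternative."""

    def __init__(self) -> None:
        super().__init__()
        self.captcha_image = False
        self.audio_alternative = False
        self.missing_alt = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        blob = " ".join(v.lower() for v in attrs.values() if v)
        if tag == "img" and "captcha" in blob:
            self.captcha_image = True
            if not attrs.get("alt"):
                self.missing_alt = True  # screen readers hear nothing useful here
        if tag in ("a", "audio", "button") and "audio" in blob:
            self.audio_alternative = True


def audit(html: str) -> str:
    auditor = CaptchaAudit()
    auditor.feed(html)
    if auditor.captcha_image and not auditor.audio_alternative:
        return "image-only CAPTCHA with no audio alternative -- inaccessible"
    return "no obvious CAPTCHA accessibility blocker detected"


if __name__ == "__main__":
    # In practice you would fetch the submission form's HTML; this invented
    # snippet mimics a form that shows a visual puzzle and nothing else.
    sample = '<form><img src="/captcha.png"><input name="answer"></form>'
    print(audit(sample))
```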
Unfortunately, nobody had received a response until today, when I read that T. Reid, of the
Reid My Mind blog
wrote about the response he got from BC.
I’ve taken the liberty of copying the text of the infuriatingly smug letter from a woman named Denise at the BC site. (See below).
After reading her letter, please make a choice and do something proactive on behalf of people with disabilities. Write an email to the BC site and express concern for providing an accessible web environment for all. The email address for contacting Denise is:
Support@BlogCarnival.com
If you’ve written a letter to BC already, then write again. Let them hear from you that the stance BC is officially taking is just wrong.
I coined a phrase several years ago: “If you’re not including somebody, then you are excluding them.” From the letter Denise wrote, it is very obvious that BC is making a choice to exclude members of the blind community.
Here is the letter he received:
From: "BlogCarnival Support"
Sent: Sunday, May 04, 2008 3:12 PM
Subject: Re: Blog Carnival Refuses to admit the blind community
Thomas -
Thanks for your email about CAPTCHA and about the visually impaired community.
Unfortunately, Blog Carnival doesn’t have plans at this time to implement CAPTCHA. We are considering redesigns of the site, but we do not have a timeline for that. We will keep CAPTCHA in mind as we look at redesign options.
Good luck with your efforts to make the web a better place!
Yours,
Denise
Support@BlogCarnival.com
His response was:
—– Original Message —–
To: "BlogCarnival Support"
Sent: Sunday, May 04, 2008 6:34 PM
Subject: Re: Blog Carnival Refuses to admit the blind community
Denise,
Thanks for the response.
You said,"Unfortunately, Blog Carnival doesn’t have plans at this time to implement CAPTCHA."
By this I am assuming you mean no plans to introduce alternative CAPTCHA solutions. That’s really too bad. Blog Carnival is missing out on an opportunity to make a statement not only to those with visual impairments, but the entire disability community. The statement, "Your participation is important to us."
I guess there is the possibility of an alternative statement based on the future redesign.
Based on this response, I personally will forgo from participating in carnivals that use BC. I will continue to make others aware of the official statement.
Feel free to let me know if there are any changes in Blog Carnival’s position on visual only CAPTCHA.
Respectfully,
T.Reid
What are you waiting for? Go send that email to Denise at BlogCarnival.com!
Monday, May 05, 2008
ATHEN seeking survey input about Information Technology and accessibility
The information below is initially targeted at those of you who work in the DSO at postsecondary institutions, but it will also involve other departments on your campus. It is interdepartmental, but the results will have the most impact on your department, so it is up to you to do the legwork and bring the other departments on board.
The Access Technology Higher Education Network
(ATHEN) is conducting a survey regarding higher education institutions' practices, procedures, and policies for addressing information technology accessibility needs of students.
The deadline for participating in the survey is Friday, May 16.
Results will be published in the upcoming ATHEN e-Journal and will be announced first in a pre-conference session at the AHEAD Conference on July 14 in Reno, NV. The session is titled "Creating Intersections that Connect Students with Disabilities and High-Tech Careers". This is an all-day Capacity Building Institute, and there will be plenty of opportunities to discuss the implications of the survey results.
For details of the ATHEN survey, its six component sections, and which departmental official they are seeking to complete each section, click the link above to go to the ATHEN blog site.
However, if you are already familiar with who is best suited to complete each section on your campus, then the survey is located at:
http://www.athenpro.org/survey/
Thursday, May 01, 2008
Man's finger is regenerated, including the nail and fingerprint
In February of last year, I reported on a substance manufactured from an extract of dried pig’s bladder that was used to
Regenerate a man’s fingertip.
Here’s an update on that item.
Fox News shares an
Associated Press article
discussing how a man’s fingertip was regrown in only four weeks, fingernail and fingerprint included, with the aid of this dried porcine bladder extract (which the man calls “pixie dust”).
Now, I don't know if the man in the Fox News story is the same man mentioned in the article I previously reported on. The Fox News report does describe the extent to which the finger regenerated, which is fantastic even to conceive of and is why I'm writing about it today.
What else can I say? Science continues to impress and amaze me.
The iPhone now has some accessibility...sort of
Well, the iPhone is coming along in providing accessibility to customers with disabilities.
Maybe that should actually read that AT&T, the exclusive provider of voice and data plans for the popular Apple product, has finally made a jump to address some specific
accessibility concerns on the iPhone.
While the news article linked above is an umbrella announcement about accessibility, what it offers is more specifically AT&T providing a plan for internet and messaging for iPhone customers who are "deaf, hard of hearing, have a speech disability and/or hearing loss."
This really is great news for this group of customers and I applaud AT&T for doing the right thing. After all, why should customers have to pay for a voice plan that is of limited use or value to them?
The Text Accessibility Plan for iPhone
is a $40-a-month flat-rate feature that gives customers with a qualifying disability unlimited access to web browsing, email, and text messaging.
But didn’t AT&T promise a plan like this some time ago, like, um, back in December? I suppose almost six months late is better than never.
It appears that AT&T used that announcement to demonstrate their sincerity in making the iPhone accessible to even more customers with disabilities. That announcement also included information about the use of a mobile magnifier to help people with limited vision see their screens. This would be good news, if true, and another great step forward in providing accessibility.
But I have to ask, what magnifier program might that be? Is this the same Mobile Magnifier by Code Factory that AT&T has been selling for use on phones running the Windows Mobile or Symbian operating systems? Is that same application now Apple compatible?
Finally, the article said that another option will be for the Mobile Speak screen reader (also made by Code Factory) to announce the menu options. This one has me scratching my head. Unless there is some voice-command aspect, how will a person who cannot touch the correct spot on the touch screen make the Mobile Speak software work? Unless there has been some change in the physical build of the iPhone, there are no buttons on it, and the sole input is the touch screen, which, without modification, would make navigation by the blind completely impossible.
And, I have to ask again, can the Mobile Speak program now run on the Apple operating system?
Maybe I just missed the press release announcing Code Factory products now working across the competing Windows and Apple operating systems.
Or maybe not.
Check out the official Code Factory list of supported products.
Good work on getting some accessibility options rolling, AT&T, but I think maybe there’s a hole in that umbrella.