Wednesday, May 30, 2007

So, our air conditioner has been out since Monday. I don't think it's anything big; the repair person is coming today, and while no HVAC repair is cheap, it shouldn't be too outrageously expensive. I think it just needs more coolant.
Anyhow, the lack of air conditioning has made it pretty hot in the house. I've opened the windows at night to let the cool air in, and by morning the house is pretty cool: yesterday morning it was 77º; today it's 80º. But this has made me realize that if we had a more efficient method of exchanging the air inside our house with the air outdoors, we could use a lot less energy for air conditioning.
Yes, there exist whole house fans, and that would help, but that's not all that I'm talking about. I think it would be useful to have an intelligent house with a hybrid HVAC system. Here's my idea.
You would provide a range of acceptable indoor temperatures, let's say 68-80ºF, and an ideal temperature, let's say 78º. Your house would shut all the windows and turn on the air conditioner, targeting your ideal temperature, whenever the outside temperature was too high to keep the house within your range. Otherwise, the house would open the windows, exchange the indoor air with outdoor air using the whole-house fan, close and shade the windows on the side of the house where the sun is beating down, and so on.
And the system could do the same sort of thing in the fall and winter: shut all the windows and turn on the heat at night, and then harness the warmth of the air outside during the day when it could.
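Just to daydream a little further, the core decision logic is simple enough to sketch in a few lines of C++ (this is entirely made up, of course; the hard part would be the sensors, actuators, and motorized windows):

```cpp
// A made-up sketch of the hybrid HVAC decision logic described above.
struct Comfort {
    double lo, hi, ideal;  // acceptable range and ideal temperature, in F
};

enum class Mode { OpenWindows, CloseAndCool, CloseAndHeat };

// Decide what the house should do, given the outdoor temperature.
Mode chooseMode(double outdoor, const Comfort& c) {
    if (outdoor > c.hi) return Mode::CloseAndCool;  // too hot out: seal up, cool toward c.ideal
    if (outdoor < c.lo) return Mode::CloseAndHeat;  // too cold out: seal up, run the heat
    return Mode::OpenWindows;                       // outdoor air can do the job: fan and shades
}
```

With the numbers above (a range of 68-80º and 61º outside right now), chooseMode would say to open the windows and run the whole-house fan, which is exactly what the air conditioner can't do.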
If we had a system like that, I think we'd need to run the air conditioner only for a couple of hours during the day. But as it stands, even though it's 61º outside right now, the temperature in the house is 80, and if our air conditioning system were working, it would be running right now. What a waste!
Sunday, May 27, 2007
In Which My Ire Is Ignited by an Anti-Obesity Activist
I recently saw a blog entry at The Zero Boss with a video that got me really angry. It was a Fox News clip discussing the alleged obesity of American Idol star Jordin Sparks. Rail-thin MeMe Roth, a representative of the National Action Against Obesity organization, described Sparks as a picture of poor health, and decried her fame as a bad example for children.
I am totally out of the loop, and I had never heard of Jordin Sparks. So I looked her up on the internet and saw that she is an attractive young lady of large stature. She looks like she is at least six feet tall, and she is curvaceous. But Ms. Roth can't appreciate the aesthetics of Jordin Sparks, instead prognosticating that she's a walking case of diabetes, heart disease, and cancer just waiting to happen. When asked if Sparks was obese, Roth demurred, saying that was a question to be settled between Sparks and her doctor, by which she meant "yes, but I'm too much of a coward to say it."
MeMe Roth came off as an aptly-named, self-centered, condescending prig in the interview segment, and I really think that's too bad. The goals of her organization, as outlined in her (otherwise ridiculous) letter reproduced in this blog entry, seem like noble, lofty goals. I'm all for eliminating junk food from schools, getting rid of unhealthy food additives, and stopping obesity from being handed down from one generation to the next. Unfortunately, her smug attitude undermines her organization's effectiveness.
If she had an iota of compassion in her body, Ms. Roth would realize that obesity is not something that can be shamed out of people. Obesity is a result of genetics, food choices, and exercise habits! If she truly cared about stopping obesity instead of feeding her own superiority complex, she would exercise compassion for obese people and find constructive ways to help them make different choices about food and exercise.
I think that she must have no experience with excessive weight. She's never had to rethink all her assumptions about food, or change her eating habits. She's never been "good" for a week, only to get on the scale and find that she's gained five pounds. She's never been snubbed, discriminated against, or ridiculed for her size. She's never been told that the ugliness of her body is due to a flaw in her own character. She's never had trouble finding clothes that couldn't be used as tents for thin people. She's never experienced the humiliation of spilling over into somebody else's seat on an airplane. She's never stayed home out of self-consciousness and fear that people will stare. She's never struggled to exercise because the weight she's accumulated makes her knees creak, or makes it difficult to walk from one end of the room to another. Until she's had at least one of these experiences, I suggest that she kindly shut the f*ck up!
I think Jordin Sparks is a positive role model for children, because she shows that you don't have to be anorexic and toothpick-thin in order to succeed in life. I wish her all the success in the world.
Friday, May 25, 2007
A Justifiable Rant
Jeff and I were really struck by one of the graphs in An Inconvenient Truth. It was the one about fuel efficiency. American cars get miserable fuel efficiency compared to cars in other parts of the world, because we have very low standards. Our fuel efficiency standards are so low that we can't even sell our cars in China, because China has higher standards than we do!
American automakers argue that we mustn't raise our standards, because they can't make cars that get such high mileage. I would buy that argument, except for the fact that American companies such as Ford already sell cars (even large cars) in Europe that get upwards of 40 mpg!
My better half posted an eloquent rant on his blog. I have nothing to add to his expert takedown except a slight clarification about the mileage. The British use the Imperial gallon, which is composed of eight pints of 20 fluid ounces each (160 fl. oz. in all), while we use the American gallon, which is composed of eight pints of 16 fluid ounces each (128 fl. oz. in all). So if we want to compare like to like, we have to take the British mileage and multiply it by roughly 16/20, or 0.8, to get the mileage we are accustomed to. A car rated at 40 mpg in Britain, for example, gets about 32 mpg by American measure.
Thursday, May 24, 2007
Interesting Archive
I came across this interesting website via Thus Spake Zuska. It's Harvard University's Women Working, 1800-1930 archive from their open collections, and it features "digitized historical, manuscript, and image resources selected from Harvard University's library and museum collections." There are interesting pictures, diaries, and newspaper articles from the times, including this photograph of the observatory computer room and staff from 1891 (where a "computer" was a person who did the calculations by hand). The women in the picture analyzed photographs from the telescope and performed calculations on the astronomical data.
There are a lot of interesting resources at that website and I encourage you to check it out!
Monday, May 21, 2007
Al Gore, Climate Change, and Computing
We watched An Inconvenient Truth for the first time about a month ago. I thought it was a really good movie. I've always admired Al Gore (and I voted for him in 2000, I'm not ashamed to say).
I can't imagine how much it must have hurt to have won the popular vote but lost the election. But I was really glad to see that Al Gore has bounced back, and found his calling in spreading the word about global warming. It's something he's obviously passionate about and that he loves to share. It warmed my sentimental heart to see him doing something so wonderful. (My inner mom wants to give him a hug and kiss the top of his head, and tell him how proud I am. Something tells me that the Secret Service wouldn't let me close enough to do the deed, though.)
If you haven't yet seen An Inconvenient Truth, I'd encourage you to do so. Global climate change is an incontrovertible fact. I know a lot of folks who work in the climate modeling area, and their only agenda is to find out the truth. Nobody wants climate change, but the evidence and the models are a smoking gun. The people who model the climate use the same sorts of numerical methods that are used to model cars, airplane wings, and even our nuclear stockpile. Anyone who flies on a plane or drives a car should put as much faith in climate models as they do in finite element models of cars and airplanes.
One interesting aspect of these climate models is that they're modeling a nonlinear dynamical system, in which many of the effects are not linear. In other words, small perturbations can result in large changes in the system. So they start the climate models with slightly different sets of initial conditions, run them all, and come up with a range of possible outcomes. That's why the predictions give a range instead of a single exact value at the end.
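You can see this sensitivity in even the simplest nonlinear system. Here is a toy illustration (the logistic map, which has nothing to do with actual climate physics): five nearly identical starting points, iterated through the same equation, end up scattered all over the place.

```cpp
#include <cstdio>

// Toy illustration of sensitivity to initial conditions: iterate the
// logistic map x -> r*x*(1-x) from five nearly identical starting points.
// (Not a climate model! Just the simplest system that shows the effect.)
int main() {
    const double r = 3.9;   // a parameter value in the map's chaotic regime
    const int steps = 50;
    for (int k = 0; k < 5; ++k) {
        const double x0 = 0.5 + k * 1.0e-6;  // perturb the sixth decimal place
        double x = x0;
        for (int i = 0; i < steps; ++i)
            x = r * x * (1.0 - x);
        std::printf("x0 = %.6f  ->  x after %d steps = %.4f\n", x0, steps, x);
    }
    return 0;
}
```

That scatter is the whole point of running an ensemble: no single run is the prediction; the spread of the runs is.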
Climate modeling takes up lots and lots of time on the supercomputers. They regularly run on thousands of processors for days at a time. As the supercomputers have gotten more powerful, they have refined the scale and added more elements to their models. It's really fascinating work, although it's not something I've ever been involved in. But the climate models could always use some load balancing, I'm sure, so maybe I could help them with that at some point!
New Scientist magazine has an interesting feature on climate change available online. If my meandering stream-of-consciousness discussion of Al Gore (and the strange maternal pride I feel for him), climate change, and computer modeling hasn't convinced you (and why would it?), then I encourage you to check their article out.
Saturday, May 19, 2007
A Perfect Day
This morning, we had a pancake breakfast with Granny and Granddad. Vinny sat in his high chair and had a bottle while the rest of us enjoyed our pancakes.
After Granny and Granddad left, we went to the park and had a picnic lunch. We sat on a blanket under a tree, by the lake. The weather was beautiful. Vinny played with some toys, and Jeff took a bunch of pictures and also drew a portrait of Vinny and me. Then we headed over to the playground and I put Vinny into a swing. I pushed him and as he began to swing, he loved it. He smiled broadly and giggled as he swung towards me and away from me.
Then, we went home and Vinny and I took a long nap, for more than two hours. We woke up when Jeff was just about finished making dinner. We went into the dining room, and Vinny enjoyed some peas while we had dinner.
During dinner, we realized that Vinny didn't really know how to chew. So Jeff made some exaggerated chewing motions, and Vinny imitated him. Jeff chewed a Cheerio, and then he gave Vinny a small piece, and Vinny chewed it (or at least tried to). I think it was more successful than previous times we've given him Cheerios to eat.
Friday, May 18, 2007
Busy... and Exhausted
I've been offline for a while because I've been really busy at work. We hosted a workshop yesterday and today, and I had to get up extra early and got home extra late. I met a lot of new people, which was a lot of fun, and I also saw a number of old friends, which was fun too!
And the in-laws are visiting, on top of it all. They haven't gotten to see much of me, because of all that was going on at work, but I don't think that they mind too much, because really, they just came to see Vinny.
Speaking of Vinny, he's recently figured out that he can wave at people. The only thing is, and I don't know where he got this idea from, he holds his arm horizontally with the back of his hand to his chin, and wiggles his fingers. It is very strange, but very cute.
Wednesday, May 16, 2007
Scientiae Carnival #6 Is Up!
Scientiae Carnival #6 is up at ScienceWoman's blog. Go read it; the theme is "mothers and others; those who influenced us along the way." There are some touching tributes to mothers there.
Monday, May 14, 2007
More Moron Monday
Rachel's still in Turkey, so it's up to me to carry on the tradition. And carry it on I have!
Act the first: I forgot that it was (Moron) Monday until I was already at work, and thus unable to post. (Luckily, I had already planned to work from home this afternoon, so I could post when I take a break.)
Act the second: At work, I submitted a job on the tenth fastest supercomputer in the world, and my job ran so fast that I didn't realize it had run! When I checked the queue, my job was not in it, having run and finished before I could check the queue. And then the output file was hiding in plain sight, right there in my directory. I did not figure this out until I sent a message to the help desk about my incredible disappearing job, and the person on the other end said, "uh...dude... your job already ran." (only much more politely than that) I replied with an apology for having wasted that person's time.
How's things by you?
Sunday, May 13, 2007
Happy Mother's Day!
This is my first Mother's Day as a mother (although we did celebrate it last year, when I was pregnant). We celebrated it a little early. We went out to dinner with our friends Adam and Jody last night, at a restaurant that we'd heard about but never been to before. It was called the Copper Cellar, a cozy place in a basement near the UT campus. The food was delicious and we all had a good time. Vinny sat in a portable high chair that fits onto a regular chair and had previously belonged to his cousin Byron.
For Mother's Day I also got a book that I had requested, Parenting Beyond Belief. I haven't finished reading it yet, but so far it is very interesting and it has given me some good ideas. I'll try to post a review of it whenever I finally finish it.
This was the best Mother's Day I've had in a long time, I guess primarily because I was able to shift the focus to celebrating myself as a mother. And I think that so far, I've done a pretty good job as a mother. Not perfect, of course, but I have a happy, healthy, thriving son and that is nothing to complain about. I'm definitely meeting the level of success that I wanted: I am the best second-rate mother in the world!
Labels: all about me, friends, holidays, motherhood, Vinny
Friday, May 11, 2007
I'm Now in Heaven
Thanks to my blogfriend Tony, I now have the capability to typeset equations using LaTeX. What could possibly be better than that?!?! (Okay, maybe a lifetime supply of ice cream, but only by a small margin.)
Fellow Math geeks will want to take a look at this page. In honor of this new development, I shall hereby provide you with the only trigonometry identity you will ever need to know (all the others can be derived from this one):
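$$e^{i\theta} = \cos\theta + i\sin\theta$$

(It's Euler's formula: multiplying $e^{ia}$ by $e^{ib}$ hands you the angle-sum formulas, and taking the magnitude of both sides gives you $\sin^2\theta + \cos^2\theta = 1$.)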
Ask an Applied Mathematician, Math Education Edition
Devoted reader Tony asked (in this entry):

"I am very interested in hearing more about these inadvertent sexist syntaxes. I constantly strive to eradicate biased behavior from myself, and since I am planning to become a math/science teacher, this issue has become even more important to me. I am especially concerned with the attitudes that today's young women have toward mathematics, and I certainly want to make things better, not worse. Could you maybe do an Ask an Applied Mathematician post about fairness in the classroom?"
Sure, Tony. I am not an educator myself, but I have been educated, and based on my experience I have a few ideas of what not to do:
BAD IDEA #1:
When introducing the different sets of numbers (e.g., the reals, the integers, the complex numbers, the irrationals), end the lesson by discussing your favorite set of numbers: Cindy Crawford's measurements. (True story; it happened to my sister at a math/science/technology high school magnet program.)
BAD IDEA #2:
When male classmates humiliate your best pupil, making fun of her for being female and being interested in joining the computer club, stand idly by and let her take care of it herself. (Happened to me in 7th grade.)
BAD IDEA #3:
Frame an otherwise entertaining story about computing the power usage of your computer within the context of your math-illiterate wife blaming you for the high power bill. For extra bonus points, explicitly tell the students that women can't do math. (This happened in a classroom I was observing.)
BAD IDEA #4:
When handing back exams, place the person's exam on their desk, but then grab it back, leaf through to problem number three, and exclaim, "Oh yeah, don't do problem number three [this way]. Only a stupid person would do it that way." (True story, I was the "stupid person" in question, although it goes without saying that you shouldn't do this to anybody, independent of their gender.)
Of course I think you are not likely to do any of the above, based on what I know of you. But there are subtler things that can occur to discourage girls in the math/science classroom.
The classroom should be a safe place, a place where no one is judged on their gender, race, religion, sexual orientation, etc., and I would encourage you to strive for that. I think that girls will thrive in a class where they feel safe from harassment, intimidation, etc. coming from their peers. I know that I performed a lot better when I knew that the teacher would come down hard on the people who made fun of me outside of class. I also felt a lot more comfortable when I knew the teacher wasn't going to pick on me, either. Basically I guess I liked the teachers who were benevolent dictators.
There's a phenomenon known as stereotype threat, in which people who are reminded of their status as members of traditionally underperforming groups perform worse on exams. They don't even have to believe that any stereotypes about their groups are true; they still perform worse than they would have if they hadn't been reminded. The more the differences are played up, the worse the members of the threatened group perform. It even works on white males, if they're told just before taking a math test that Asians perform better on it. Stereotype threat accounts for at least some of the underperformance we see in the scores of certain groups on standardized tests, such as the SAT. But it does not account for all of it.
Unfortunately, society reinforces the idea that women cannot do math in many ways. There was the talking Barbie who declared that "Math is hard!" There are the pajamas for toddlers I saw at the store; you could choose from the "boy" designs with astronauts, athletes, and engineers, and the "girl" designs with ballerinas, pop stars, and fashion models. And the toy section with a doctor's kit aimed at boys and a nurse's kit aimed at girls, despite the fact that in this country, there are more women in medical school than men. And the articles in the newspaper depicting particle physics as a fun game that men play. And the online brochures for math and science departments, depicting their glorious department full of white (and sometimes Asian) men.
There's not much you can do in the classroom to counter the sexism ingrained in our society. But you can make your classroom a safe space, where the sexism gets checked at the door. And taking at least a little bit of time to discuss the accomplishments of women mathematicians and women scientists could help the girls who are interested in math and science feel less like freaks.
I think this is probably a very incomplete answer, and I wish I could make it more complete. I'm cross-listing this entry with the scientiae carnival, in hopes that others will contribute more ideas for you. Comments are very welcome.
Got a question for the Applied Mathematician? Leave it as a comment, or e-mail me!
scientiae-carnival
Ask an Applied Mathematician, Supercomputing Paradigm Edition
Fearless reader Marius asked me what I meant by the "stagnating supercomputing paradigm" in the previous installment of "Ask an Applied Mathematician." It was a good question, and it deserved a good answer. What follows is a long answer, but I think it's a good one, too!
For any of my answer to make sense, we need to discuss how the current computing paradigm came about. For the larger perspective, I rely on the expertise of Bill Camp, recently retired from Sandia National Laboratories, who visited our lab last year and gave a very eye-opening talk.
In the 1940's, John von Neumann laid the foundation of modern computing by proposing the von Neumann computer architecture. The idea of the von Neumann architecture is that the computer has a processor and a storage component that holds both the program and the data, and both the program and the data can be modified. This was a big change from earlier computers, which were more like calculators: they had a single program hard-wired inside, and if you wanted to change the program, you had to rewire the machine. Now all you had to do was input a new program, and that became even easier with the invention of FORTRAN.
In the 1960's, the vector processor was invented, which sped up computations by performing calculations on an entire array of numbers simultaneously. In the 1970's, Cray came out with a vector supercomputer, the Cray-1, which famously doubles as a couch (click on the link to see the Wikipedia article, including a picture). Also in the 1970's came the computer on a chip (a CPU, memory, and I/O lines all on one chip), which in the long term ended up revolutionizing the supercomputing industry.
But the vector-supercomputer honeymoon was rather short-lived. In the 1980's, Amdahl's law (which observes that the speedup of a program is constrained by the fraction of the program that must be performed serially) was interpreted as a very pessimistic law of diminishing returns. Camp described the bleak outlook of the times: "Parallelize = paralyze."
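In symbols: if a fraction $P$ of a program's run time can be spread perfectly across $N$ processors, Amdahl's law says the best possible speedup is

$$S(N) = \frac{1}{(1-P) + P/N} < \frac{1}{1-P},$$

so a code that is 95% parallelizable can never run more than 20 times faster, no matter how many processors you throw at it. Read that way, massive parallelism looked like a dead end.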
Obviously today, nothing of the sort is true. The high-performance computing (HPC) industry is booming, and developing the biggest, fastest machines has become a national security priority. So what happened to change this?
What happened was that some "disruptive technologies" came along and revitalized the industry. MPP (massively parallel processing) was a revolutionary idea that was mostly dismissed by the industry. But Sandia was willing to take a chance on it. They invested in a 1024-processor nCube, and the benchmarking numbers were so amazing that several prominent scientists accused them of fudging their results! But the numbers were real.
As it turned out, "inner loop" parallelism (the kind a vector processor exploits) really is limited by Amdahl's law. But "outer loop" parallelism, which can be exploited by distributing the work over multiple processors that do not share memory, is much more forgiving. So eventually those prominent scientists saw that Sandia was not cheating, and the MPP paradigm grew to dominate the field.

So this is the first reason I think we need to be aware that the predominant programming paradigm may not last: another disruptive technology may come along and displace the current one. Indeed, the high-performance computing world is well aware of this, and at SC '07 they are holding major events centering on disruptive technologies.
Continuing along the timeline of computing history: in the early 1990's a lot of supercomputer companies went under, and there was a Great Depression in the HPC industry. Out of desperation, some people started hooking bunches of commodity computers together into Beowulf clusters running Linux. As Linux matured and the price of commodity parts came down, this architecture became more and more viable and affordable.
In 2002, a single event revitalized the American HPC industry: the behemoth Earth Simulator machine came online. It was a Japanese project that somehow escaped American notice, despite its five-year development plan and $350 million price tag. I attended SC 2002, and I remember hearing the uproar from my colleagues when, for the first time, the top spot on the Top 500 list was claimed by a non-American machine. It was at that point that we in this country took up the gauntlet to develop faster machines. Cray launched the X1, and IBM announced plans for Blue Gene, a very innovative machine that claims the top spot on the current Top 500 list.
Today's machines are based on the same concepts as the earlier MPP machines. The major differences are that today we have faster processors, faster interconnects, and better software, and we use many more processors and interconnects in our machines. Sandia's 1024-processor nCube was huge at the time; just yesterday I was working on a machine with more than 20,000 processors; and the top machine on the current Top 500 list has more than 100,000 processors! The power that it takes to run these machines could power a small city. The cooling system for my lab's machine room requires all the water from a water main the same diameter as the one that supplies my town of approximately 25,000 people. How much bigger can we go?
Power and water constraints aside, there are limits to how much more we can improve these big machines' hardware and software. Moore's law, in its popular form, says that computing power roughly doubles every 18 months. But there is a limit to that, because processors are constrained by the size of atoms: you can shrink a transistor only so far, and its features can't be smaller than a couple of atoms. Similarly, we have good interconnects now, and optical interconnects are hitting the market soon, but ultimately we are constrained by three-dimensional geometry and the speed of light. I was trying to avoid talking about memory and I/O, but those are also troublesome parts of the equation. Bigger, faster machines invite bigger problems, resulting in bigger outputs. Some programs, such as climate models, produce outputs on the order of tens or even hundreds of terabytes, and we have to somehow put those huge outputs on disk. The I/O problem is the biggest bottleneck in today's machines, and there is really no scalable solution.
Optimizing the operating system, libraries, and compiler is another way to squeeze better performance out of a machine. IBM's Blue Gene project was revolutionary in that respect: instead of full-blown Linux, the compute nodes ran a stripped-down, special-purpose kernel. Cray did a similar thing with their Catamount operating system, and these tweaks have squeezed a lot of performance out of these big machines. But eventually, there won't be any more tweaks to make. And we'll be stagnating.
Even now, it's hard to scale programs up to the level of 10,000 processors. A program that scales well from four to eight processors, for example (meaning the run time on 8 processors is half the run time on 4 processors), doesn't always scale well beyond that. Typically, as we increase the number of processors running a program, the program will scale well at first, but then the performance gain trails off, and in fact it may even lose performance at a certain point. Even embarrassingly parallel programs run into this problem, because using more processes translates to doing more communication, especially as a means to overcome the I/O bottleneck I described earlier. As computers get bigger and bigger, it will become even harder to take advantage of the state-of-the-art.
So, now would be a perfect time for a disruptive technology to enter from left field. It could be quantum computing: a working quantum computer might perform on a single processor what today takes a two-acre machine room and enough power and water for a small city. Will they get quantum computing working, or will some currently unknown disruptive technology save us?
Got a question for the Applied Mathematician? Leave it as a comment, or e-mail me!
Monday, May 07, 2007
A Visitor, and More Double Standards
Last week I hosted a visitor at work: a senior member of the technical staff at another national lab. Okay, it wasn't actually as glamorous as it sounds. The visitor was a classmate of mine from graduate school, and "senior member of the technical staff," as distinguished as it sounds, is actually the lowest level for Ph.D.s. (If you have a master's, you start out as just a "member of the technical staff.")
Anyhow, he's a good friend of mine in addition to being a top-notch scholar, so it was fun to catch up on his life as well as hear about his latest research. He went out to eat with me and Jeff and Vinny one night, so he got a chance to meet Vinny for the first time. He and his wife have a son, and another one on the way, so he wasn't spooked by hanging out with us and our baby; indeed, he was actually excited to meet Vinny.
In addition to talking about our research, we also talked about our work environments, and I came away really glad to be working where I work. Although he makes a lot more money than I do, I wouldn't want to trade jobs.
We also had an interesting discussion about what it's like to be a woman in science. Obviously my friend is male, plus he's white, so he has no experience being the "other" at work. He is not obviously biased against women but I told him about the sort of unconscious bias that he could inadvertently get caught up in. For example, people tend to describe men as "original thinkers" and "very smart" in letters of recommendation, whereas they tend to describe women as "hard workers" and "a pleasure to work with." If you didn't know about this bias, you might select the "original thinker" rather than the "hard worker," when in reality they could both be equally good candidates. I personally saw a lot of this sort of letter-writing bias when I was on the department graduate admissions committee as a graduate student. So it's definitely something to watch out for; otherwise you may be inadvertently perpetuating bias. He was completely oblivious to this fact until I pointed it out to him.
I talked about the door-opening incident before, but another thing that gets me sometimes is that many men do not extend their hand to shake mine, whereas they will readily offer their hand to another man. According to some etiquette experts, men should not initiate a handshake with a woman, although that rule is not supposed to apply in a business setting. But when I took my visitor to meet some men whom I didn't know either, they readily shook my visitor's hand but not mine.
My boss is male but he's not white, so he at least has an idea of what it's like to be treated differently, even if his experience isn't the same as mine. He's no stranger to racism, and he's sensitive enough to lend a sympathetic ear when I feel the need to vent. I don't get angry about any one incident, but someone once made a very apt metaphor to describe the additive effect of these little slights: it's like death by papercuts. Oh sure, one, two, even ten of them aren't going to do much damage. But over the course of a lifetime, it hurts a lot.
But, back to my guest. He was charming, everybody was impressed with him, and he had a great time. He invited me out to his lab, and I think I will pay him a visit in the fall.
Moron Monday
Normally, this is a feature over at my sister's place, but she's a big turkey... I mean, she's in Turkey so she asked her sisters to step in. For those of you who aren't actually readers of Rachel's blog, the idea is that you confess to ways in which you have been a moron recently, and we all get a big laugh comparing stories. For the complete rules, see this page.
My moronic moment (was it only a moment?) this week occurred when I spent an afternoon chasing after a bug in my program, only to discover that it was a typo: a lower-case c in the middle of a variable name where there should have been a capital C. C and c look remarkably alike; I think if it had been A and a, I would have seen it almost immediately. (The rest of the paragraph is computer science gibberish; feel free to ignore it.) But instead I decided that I needed to instantiate that templated class in a .cc file, despite the fact that it was wholly defined in a header file, and I knew there was no need to instantiate a class defined solely in a header file. Then I got pissed because I kept fiddling with the instantiation and nothing helped.
What did you do?
Sunday, May 06, 2007
Ask an Applied Mathematician
Astute reader Marius, a graduate student in aerospace engineering with a penchant for optimization and computing, had a question for the applied mathematician:
"What 'computer science' knowledge do you think is most important? Knowing a range of languages, knowing the internal details of the machines, strategies of how to structure your code, anything else?"
Excellent question, Marius!
I can give you an answer based entirely on my experience in the computing field. Like you, I did not have a computing background until I threw myself to the wolves, so to speak, by entering graduate school in computer science.
By far the most important knowledge is that of algorithm development. You're going to be using the computer as a tool to find out whatever it is you actually want to know, so knowing how to use it well is what will take you the farthest.
When I say "algorithm development," what I mean is understanding how to convert the math into something a computer can do, and going about it in an intelligent, systematic, and efficient manner.
I think it is important to know one programming language well. By "well," what I really mean is that you should be able to write a fairly complicated code without much peeking at a book or online. Notice I say "without much peeking," because I have a memory like a sieve, so storing certain things, like precisely how to open and write to a file or exactly what the floor function is called, is a low priority given the limited capacity of my brain. But you should be fluent enough in the language that your limited vocabulary is not a major stumbling block in writing code. (What I often do when I'm in the depths of development is write myself a comment at the place where I need to open the file, and then come back later with the proper syntax.)
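For the record, the file-writing boilerplate I mean looks something like this in standard C++ (the file name and contents here are made up):

```cpp
// The open-a-file-and-write-to-it syntax I can never keep in my head.
#include <fstream>

int main() {
  std::ofstream out("results.txt");  // hypothetical output file
  if (!out) return 1;                // always check that the open worked
  out << "iteration 1: residual = " << 0.5 << "\n";
  return 0;                          // the file closes itself on exit
}
```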
Once you know one programming language, it is relatively easy to pick up on how other languages work, and you should be able to do a decent job of updating other people's codes or using them as subroutines. Of course you would want to write original code in your preferred language whenever you can.
As for which programming language, I'm not interested in starting a programming language war, but if you want to do the sorts of things I'm envisioning, you'll want to be fluent in a mainstream language such as C++, C, or FORTRAN. Personally, the vast majority of my work is in C++ these days, and I'd recommend it because C and FORTRAN are more limited in what you can do with them. To me, it's really nice to be able to write in just about any programming paradigm: procedural, object-oriented; you name it, you can do it with C++. Of course, C++ also lets you shoot yourself in the foot that much more easily. I might otherwise suggest Java, which is a little safer than C++, but Java programs run slower than C++ programs, and more importantly, Java lacks much of what you'll need for your codes, such as parallel extensions.
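By way of illustration (a toy sketch, nothing from a real project), here is the same running sum written procedurally with a free function and then in an object-oriented style with a little accumulator class:

```cpp
// The same task in two of C++'s paradigms.
#include <cstdio>
#include <vector>

// Procedural style: a free function operating on plain data.
double sum_of(const std::vector<double>& v) {
  double s = 0.0;
  for (double x : v) s += x;
  return s;
}

// Object-oriented style: an accumulator object that carries its own state.
class Accumulator {
 public:
  void add(double x) { total_ += x; }
  double total() const { return total_; }
 private:
  double total_ = 0.0;
};

int main() {
  const std::vector<double> v = {1.0, 2.0, 3.0};
  std::printf("procedural sum: %g\n", sum_of(v));

  Accumulator acc;
  for (double x : v) acc.add(x);
  std::printf("object-oriented sum: %g\n", acc.total());
  return 0;
}
```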
As for parallelism, I'd recommend reading up on MPI (Message Passing Interface, the industry-standard message-passing library), because parallelism is very important and very useful, and if you're writing a big simulation code it's essential. If you're not familiar with MPI, there are a whole bunch of really good tutorials out there. (If you want a bigger-picture tutorial that will teach you how to use a supercomputer, google for supercomputing crash course and take the first link.)
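If you'd like a taste before diving into the tutorials, the canonical first MPI program looks something like this (compile with an MPI wrapper compiler such as mpicxx and launch with mpirun; the details vary by installation):

```cpp
// Minimal MPI "hello world": every process reports its rank.
#include <mpi.h>
#include <cstdio>

int main(int argc, char** argv) {
  MPI_Init(&argc, &argv);                // start the MPI runtime

  int rank = 0, size = 0;
  MPI_Comm_rank(MPI_COMM_WORLD, &rank);  // which process am I?
  MPI_Comm_size(MPI_COMM_WORLD, &size);  // how many processes are there?

  std::printf("Hello from process %d of %d\n", rank, size);

  MPI_Finalize();                        // shut the runtime down cleanly
  return 0;
}
```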
As for the other stuff, a basic understanding of how a computer works is helpful, just so that you can learn to think like a computer; thinking like a computer helps you program one. But don't worry about all the little machine-specific details, because you want to write code that is platform-independent: for all you know, we are on the verge of a new model of computing. (In fact, we will need to come up with one before long, as the current paradigm is stagnating.) Let a crazy computer scientist who actually enjoys this sort of thing optimize your code so that it runs fast on the latest machine! (Note: I am not that sort of computer scientist. I am crazy in a different way.)
I hope this has been a helpful answer, Marius, and if you have any other questions or need me to explain something further, please do not hesitate to ask.
Got a question for the Applied Mathematician? Leave it as a comment, or e-mail me!
Adventures in Double Standards
When I returned to work, many people asked me who was taking care of the baby. If I'd been a man, nobody would have asked that question. It's because women are supposed to take care of babies, whereas men are not. Even government studies define father care as daycare, not parent care! Additionally, fathers are portrayed as clueless incompetents, unsafe to leave alone with their children.
Fathers are painted as incapable buffoons in commercials, movies, books, etc. I've seen more commercials than I can count in which Dad can't control the kids and Mom saves the day (using the product being advertised, of course!). Movies get a lot of comedic mileage out of incapable Dad messing everything up while Mom is out. Mr. Mom is but one in a long line of such movies.
Combine this perceived Dad incompetence with the fact that childrearing is not a prestigious activity, and you have the reason why this stereotype is still pervasive. Women used to be perceived as incapable of rigorous academic work, for example, but because it was prestigious and rewarding, we fought the stereotype and entered into science (and many other fields). But many men feel no motivation to fight the incapable father stereotype, because if men were actually thought of as competent parents, then that would mean more (low-prestige) work for them.
So that leaves us with a society where a man taking care of his child is "babysitting," where my stay-at-home-dad husband goes to the store with our son and everybody asks "Where's your wife?", where people look down on him for making a choice that they wouldn't think twice about a woman making.
Labels: all about me, family, feminism, motherhood, stupid, Vinny, work
Saturday, May 05, 2007
Nerd-in-Training
Vinny is our sweet little nerd-in-training.
He is fascinated by glasses. When my face comes close enough to his, he reaches out and swipes my glasses off my face. He will try to grab Jeff's off his face too, but Jeff has a band that goes around the back of his head and keeps his glasses safe. Sometimes, just for fun, we put our glasses on Vinny's face. He loves wearing glasses. It's very strange. I've never heard of any other baby who loved glasses so much. But it's a good thing, because I suspect that corrective lenses are an inevitable part of his future. [Photo: an early picture illustrating his love of glasses.]
He is also fascinated by the computer. Oh, how he longs to type on the keyboard and click the mouse! Jeff says that Vinny has almost ordered things on eBay before Jeff noticed what he was up to. We have an old keyboard that we let Vinny use, and he is so excited about it and just loves to play with it.
But the biggest indication that he's a nerd-in-training is the following. He's recently developed his own geeky laugh: this breathy, hoarse laugh that he does when something tickles his funnybone. I think he's laughing while inhaling. It's really adorable and it cracks me up every time he does it.
Friday, May 04, 2007
Weight Loss Secret #5492
For the past couple of years, I have chewed gum nearly every weekday (except when I was pregnant).
Upon reflection, I figured out a few years ago that sometimes, when I think I'm hungry, I'm actually just in the mood to chew. After I figured that out, I began to chew gum regularly.
Gum is an awesome help in my battle against the extra pounds. I pop a piece in my mouth every afternoon at work, and it helps me in the following ways:
- It gives me something to chew on, so that I don't go looking for food that I don't actually need because I'm not feeling hungry.
- It puts a pleasant flavor in my mouth and freshens my breath (my favorite flavor: wintergreen).
- If someone offers me some food, it is easy to decline because I am enjoying my gum.
- It cleans my teeth a little (I always chew sugarless gum).