After many unsuccessful attempts, I FINALLY posted my flash animation successfully on my website! Here is a run-down of how it all happened:
1. I download Adobe Photoshop CS4 without realizing that you can't make flash animations in Photoshop (duh). I return to the Adobe website to download the correct program, Flash CS4.
2. I import a bunch of Christmas-y images to create A Very Special Christmas Flash, and play around with the different ways of manipulating the pictures to become a flash animation. (This was surprisingly the easiest part of the process.)
3. I save my work as a ".fla" file. I import this into my public.html directory and find that this is NOT the way to post a Flash animation on the internet.
4. I publish my work and import the "flash.html" into my public.html directory. Again it doesn't work.
5. I discover you need both the .html and the .swf files in one directory for the flash to run. Accordingly, I import both into public.html and still it doesn't work.
6. I find out that renaming the files after you have published will prevent the animation from playing. So, I import the .html and .swf files into public.html exactly as I named them when I published, and, voilà, my animation appears. (A sketch of what that .html/.swf pairing looks like follows this list.)
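For anyone attempting the same thing, here is a minimal sketch of the kind of markup the published .html file uses to load the .swf. The filename "christmas.swf" is hypothetical, and the file Flash actually publishes contains more boilerplate than this, but the idea is the same: the filename written into the markup has to match the published .swf exactly, which is why renaming the files breaks the animation.

    <html>
      <body>
        <!-- Both this .html file and the .swf it names must sit in the same
             directory (here, public.html), and "christmas.swf" must match the
             published filename exactly. -->
        <object width="550" height="400">
          <param name="movie" value="christmas.swf" />
          <embed src="christmas.swf"
                 type="application/x-shockwave-flash"
                 width="550" height="400" />
        </object>
      </body>
    </html>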
I was often frustrated through this process because I didn't know what I was doing wrong, and even after searching the internet I couldn't find the answer. It seemed the only way to figure out the problem was with the help of the professor. (Thankfully the professor is almost always available and willing to help!)
So what does this all mean to the music educator? I think there will be many times when students keep trying and trying and don't quite "get it," whether the "it" be playing an instrument or singing or composing or improvising. Also, students usually don't have the resources at home to solve musical problems.
To alleviate this problem, I wonder how music educators can provide their students with resources for solving problems. These resources could span a myriad of media. I could make practice CDs with examples of right and wrong ways of playing, and examples of interpretation. I could also post those audio files on the internet. Additionally, I would consider being available at a specified time by AIM or Gchat to answer practice questions in real time. (I know it might be really hard to get students to use that; they may not want to practice at that specific time. But I could take a survey to find out the most common practice times and fit my online chat into that time. I might even vary the day from week to week to give access to different students each week.)
In short, I can infer that music education doesn't stop when class is over. We put so much responsibility on our students to practice outside of class, but we don't provide them with the tools for successful practice sessions. I hope in the future I will be able to provide examples of successful practicing during class, then make available the resources I mentioned above. Hopefully this will produce students who can use their own know-how to solve musical problems, and know when and how to access available resources when their know-how falls short.
Tuesday, December 2, 2008
New Music Ensemble Concert
Last Monday I performed in the New Music Ensemble concert. It was a challenging class for me. For a long time I struggled to accept "new music" as Music. Although I was excited to explore the vast range of extended technique on the flute, for a while I felt I was doing little more than making a bunch of cool sounds. I had to find ways to make music out of this new sound vocabulary. I interacted with the other musicians as if in a dialogue, and I tried to give my phrases and long-term improvisations a beginning, middle, and end. It was very difficult to see this as Music, though, since it didn't resemble any Music I'm used to listening to. I think if I spend more time listening to and performing new music I will understand the musicianship involved. Just like any type of Music, one will appreciate it most when one is a part of that music-making for an extended period of time. Over time one will learn the nuances of that Music that make it Music. In the meantime, though, I am glad for the experience and how it stretched my musical horizons.
There are a few aspects of the show I want to discuss. First of all, it was very theatrical. Prior to this concert, the only time I had seen lighting and video used at a musical performance was for a musical theater production. In the NME concert a few pieces used video, and most of them included lighting effects. A few of the videos were fixed media, like the audio tapes, and one was designed to react to the sounds we made on stage. However, we didn't interact with the video like we did with the tapes. Of course they were projected behind us, so we couldn't see them. But I wonder if we would have had a more cohesive performance if there had been musical responses to the fixed videos.
Another thing that interested me was the correlation I noticed between the improvised dancing and the improvised playing. The dancers seemed to use different parts of their bodies in as many ways as they could think of, and emphasized different movements at different times. Really, this is exactly what we were doing with our instruments. We used our knowledge of the sound repertoire to create an improvisation, and used as much of the instruments' capabilities as possible. We were instructed to respond to what we heard in the tapes, down to the smallest pitch or squeak. I could tell the dancers were responding to the general mood of the music, but I don't know if they were instructed to physically respond to every individual sound from the tapes. (Additionally, dancer-musician interactions could have made a more cohesive performance, but we didn't have enough rehearsal time together to work on dancer-musician improvisation.)
Because the dance moves were as exploratory as the musical elements, I think new music is a very difficult genre to bring to high school musicians. The dance could be seen as very silly if one doesn't expect those movements. And although I see the importance of introducing young students to all genres of music, especially through live performance, I think a large group of high schoolers would not be able to handle that performance. I think the mixed media would be appealing to them, but I'd be worried that they would not be able to take the whole performance seriously. I would love to find a way to introduce students to new music; it may be a genre they really like, and maybe they would someday contribute to the field! But I would definitely not make a field trip of the NME concert without some serious preparation beforehand.
Monday, November 24, 2008
Music Notation Software in the Classroom: True Music-Making?
When I was growing up, computers had black screens and green text, and were only accessible to the class when we earned a field trip to the computer lab to play Oregon Trail. It never crossed my mind (or my music teachers' minds, certainly!) that computers could be used in a music classroom!
This comes to mind because of a recent class discussion about teaching composition. I'll get to the specifics of the discussion in a moment.
When someone says "music technology in the classroom," one's default mental image is probably either recording technology, or notation software that every student can use. Finale even makes a barebones version, Notepad, that I encountered at my job two years ago. Apparently the previous teacher showed the students how to use Notepad, gave them some parameters (maybe), and the students composed. But were they really making music?
The principal was thrilled. She had a whole school full of little Mozarts working on Finale Notepad. She was constantly mentioning to me that I should bring the laptops into my classroom and have the students do some composition. However, I thumbed through some of the music portfolios from the previous year, and was not really surprised to see the low level of musicianship these compositions displayed.
I shouldn't judge too harshly. After all, these students were only in their first year or two of music class. But it seemed like they were all boxed into the same assignment, with no leeway to incorporate musical elements that were important to them. They all made similar use of the E major scale, or whatever it happened to be. It looked like there wasn't any joy in it.
So I wonder: Is there a better way to teach composition? For students who have never encountered music before, perhaps notation is not the place to start. They may feel very restricted, and may have little interest in writing the usual tonal, Western-approved type of music. Instead, start with something that will appeal to them, and spark their interest and creativity. I'm not sure what that is yet, but I wouldn't want to intimidate them by forcing unfamiliar notation on them, especially in the form of an often-confusing computer program.
After they know how to read music well, and are able to hear what they want to write before they put something down on paper, then students should be introduced to notation software. Until then, take the class set of Finale Notepad downloads out of the budget, and instead get some more instruments repaired.
Monday, November 10, 2008
Jazz at Lincoln Center
I was the lucky recipient of a friend's extra ticket to see the fall gala performance at the Rose Theater tonight. Wynton Marsalis performed with his quintet, and George Benson with his entourage of musicians. Ken Burns was also there; he was accepting the Ed Bradley award for leadership.
The reason I'm writing about this in my tech blog is to discuss the very basic argument of real instruments vs. synthesized instruments. Mr. Benson had a rhythm section (including the man who conducts Barbra Streisand's orchestra!), a live string orchestra, a chorus of backup singers, and an overworked synthesizer player. I wondered why he hired every instrument family except for winds and brass; surely a few more players could have been written into the budget? It would have added immensely to my enjoyment.
Mr. Benson was doing a tribute to Nat "King" Cole. His voice is already quite close to Mr. Cole's, and the strings' presence brought out the schmaltzy quality of Mr. Cole's signature sound. But as soon as I heard flutes, and didn't see any, I raised an eyebrow: seriously? He's got a synth player for that? It was a very good synthesizer, don't get me wrong; but no professional flute player would choose to keep such a steady, uncreative vibrato for those long notes. The tone was adequate but the expression wasn't there. It detracted from the performance because it seemed like he took an unneeded shortcut.
It was even more apparent when the synth player was doing an entire choir of brass hits. That was obviously inadequate. The sound wasn't present enough, and it was a little too tinny to be believable. I was almost distracted by its wimpiness.
Of course musicians can choose to use a synth player in place of a live performer for budget or space reasons. But Mr. Benson seemed to have both financial and spatial means. I don't know if Mr. Cole skimped on musicians, but my guess is he would have had as many live performers as possible. If Mr. Benson wanted to bolster his sound and his tribute, he should have gone with the real players, even if just for those few measures.
Thursday, November 6, 2008
Film Scores
I'm reading The Joy of Music by Leonard Bernstein, and he discusses his experience with On the Waterfront in the chapter "Interlude: Upper Dubbing, CA." ("Upper Dubbing" refers to the building where they put the sound effects and soundtrack into the movie, or "dub in" the sounds.) Bernstein describes the sound editors as geniuses who can listen to many different, seemingly contradictory, commands and somehow create an appropriate mix for the movie.
Bernstein was allowed to attend the editing sessions (a rare privilege for a composer), and added another element to the gathering: the advocate for keeping all of the music intact. HIS music, in fact; he studied the film to write scene-appropriate music, and wrote complete pieces. Bernstein still included a logical beginning, middle, and end when he composed for the film, and his work is somewhat incomplete if any part, even a single bar, is removed.
Unfortunately for Bernstein, the music is often the first to go. He admits that the best film scores are those that the viewer never really notices. If you notice the music, it has covered up the most important element: the movie. So, when the technician removes the climax of a powerful crescendo to allow Marlon Brando's grunted line to come through, Bernstein can do little more than pout about it.
I found it interesting, though not surprising, that composers will fight to keep every bar in the film. I am not a composer, and I don't have the experience of agonizing over every note of a score. Of course Bernstein would want to save every last bar if he could; he labored over that music and put his heart and soul into it. And yes, it would be incomplete if any part were removed. If you removed any part of the exposition of a sonata, wouldn't that be incomplete as well? This all makes sense, but I had never thought about it before. I wonder if that is the case with all film scores; is there more to the soundtrack that we're not hearing, and is the composer devastated to find those parts missing when he watches the film?
Tuesday, October 28, 2008
Sound Effects
Partners in Rhyme came through on their offer of free sound effects! They were more timely about it than I am about posting, though; I received the email only a day after I sent them the link to my blog. So, thanks, Partners in Rhyme, for the hours of free sound effects!
Monday, October 27, 2008
Musical Expression in Electronic Music
I attended a new music concert tonight at NYU. The performers were Elizabeth McNutt, flute; Dr. Esther Lamneck, clarinet; and... computer?? I've improvised on flute to a fixed electronic score before, but I have never seen a live performance of instrumentalists playing composed music to the improvisations of a computer! I don't quite understand the computer programming, so I'm sure I'm oversimplifying it. But I think the composers designed their computer programs to react to live sounds. And I was most impressed with the last piece, when that composer was on stage "playing" the computer.
It was a fascinating experience. The instrumentalists tonight were of course top-notch. It would have been just as enjoyable to listen to them play together. They varied their tone in extreme ways, yet still blended beautifully. Sometimes I couldn't tell if it was a flute or clarinet tone that I was hearing.
In addition to the live players, the computer contributed so much beyond what I thought was possible. I don't know to what extent the composers controlled the computer sounds, but the computer had many roles. It filled in the gaps between the instruments' parts. It repeated the instruments' sounds, mixed them together, transposed them, varied them, or made new melodies from them. It was like a sentient participant in the piece.
What really moved me, though, was to see the last composer "play" the computer. My mental image of an electronic composer was not what I saw. I thought he would seem disconnected from the performance; he would merely be there to ensure the computer program did what it was supposed to do. But instead, he played the computer as if he were playing another instrument. In his face was the concentration of a musician. His body movements gave away his musical intention: he anticipated entrances and showed phrase endings. He moved quickly and sharply for loud or sudden entrances, and more slowly or deliberately for subtler sounds. There weren't any traditional cadences or harmonic rhythm to clue me in to the form, so watching his body movements helped me understand where the music was going.
This may be a controversial post; is the concert I just saw really a Music concert? A few things I observed tonight pushed me closer to accepting new music as Music. Firstly, it is not all random sounds. Though there are fewer of the notes people might recognize from Western classical music, there is a vocabulary of sounds written and used in a distinct way. Secondly, the composers wrote for particular instruments following an expanding tradition of music. It is not neo-Classical, and it is not meant to imitate a more "tonal" tradition. It is meant to expand on what we know as music, which in turn creates a new genre. Lastly, the physical motions that I mentioned above betray musical intention. The musicians on stage were trained in more conventional settings and acquired these movements from that education. They then applied that musicality to this new genre of music.
I have been struggling with the label of "Music" for electronic music, and seeing this concert gets me closer to accepting the label. My interaction with the live performers proved most important in determining a definition of Music.