<?xml version="1.0" encoding="utf-8"?><feed xmlns="http://www.w3.org/2005/Atom" ><generator uri="https://jekyllrb.com/" version="3.8.5">Jekyll</generator><link href="/iss294-portfolio/feed.xml" rel="self" type="application/atom+xml" /><link href="/iss294-portfolio/" rel="alternate" type="text/html" /><updated>2019-04-29T16:43:50+00:00</updated><id>/iss294-portfolio/feed.xml</id><title type="html">Stone Mathers Portfolio</title><subtitle>This website is meant to showcase my blog posts and works from my Interactive Graphics course.</subtitle><entry><title type="html">Percussive Flow</title><link href="/iss294-portfolio/pieces/2019/04/23/percussive-flow.html" rel="alternate" type="text/html" title="Percussive Flow" /><published>2019-04-23T00:00:00+00:00</published><updated>2019-04-23T00:00:00+00:00</updated><id>/iss294-portfolio/pieces/2019/04/23/percussive-flow</id><content type="html" xml:base="/iss294-portfolio/pieces/2019/04/23/percussive-flow.html">&lt;p&gt;Music visualizers generally fit into two categories: either they are rendered in advance for a specific song, or they react in real time to an audio input. For the latter, they often struggle to accurately convey the emotion of a song beyond its tempo/pulse and volume. This is especially true of a song’s percussion, which is often stripped down to big bass and snare hits. For this piece, I hope to better represent the feelings behind each component of a drumkit, as well as the emotions behind certain styles of playing. It is also important that the piece reacts in real time, as this allows the user to visualize the music as they play it, adjusting their playing as they explore the various emotions that the piece can produce.&lt;/p&gt;

&lt;p&gt;This visualization is built entirely in Processing. I use the Midibus Processing library to interpret the MIDI data received from the TD-17 drum module. Each MIDI note includes a &lt;code class=&quot;highlighter-rouge&quot;&gt;pitch&lt;/code&gt; value and a &lt;code class=&quot;highlighter-rouge&quot;&gt;velocity&lt;/code&gt; value. Every pad on the e-kit has a unique &lt;code class=&quot;highlighter-rouge&quot;&gt;pitch&lt;/code&gt;, such as 36 for the bass drum or 40 for a snare rim shot. The &lt;code class=&quot;highlighter-rouge&quot;&gt;velocity&lt;/code&gt; simply measures how hard the pad was hit on a scale of 0-127.&lt;/p&gt;
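
&lt;p&gt;A minimal sketch of this interpretation step, in plain Java: the &lt;code class=&quot;highlighter-rouge&quot;&gt;MidiNote&lt;/code&gt; class and its helpers are my own illustration, not part of Midibus; only the pitches 36 and 40 and the 0-127 velocity scale come from the description above.&lt;/p&gt;

```java
// Illustrative wrapper for an incoming MIDI note. Pitch 36 (bass drum)
// and 40 (snare rim shot) are the values mentioned above; normalizing
// velocity to 0.0-1.0 makes later size/brightness mapping simpler.
public class MidiNote {
    public final int pitch;
    public final int velocity; // how hard the pad was hit, 0-127

    public MidiNote(int pitch, int velocity) {
        this.pitch = pitch;
        this.velocity = velocity;
    }

    // Fraction of the maximum strike strength.
    public double normalizedVelocity() {
        return velocity / 127.0;
    }

    public boolean isBassDrum() { return pitch == 36; }
    public boolean isSnareRimShot() { return pitch == 40; }
}
```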

&lt;p&gt;For the visualization itself, the background uses the MeshRender class from the &lt;a href=&quot;http://www.objectstothinkwith.com/tracer/&quot;&gt;Tracer&lt;/a&gt; library to create a series of circular paths. Tracers follow these paths behind the scenes and, when two tracers are within a minimum distance of one another, a line is drawn between the two. As the user plays more notes per second, the minimum distance determining these connections increases. As a result, as note density increases, the MeshRender begins to fill in the background.&lt;/p&gt;
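
&lt;p&gt;The connection rule can be sketched as follows; &lt;code class=&quot;highlighter-rouge&quot;&gt;notesPerSecond&lt;/code&gt;, the base distance, and the growth rate are all illustrative stand-ins, not values from the piece.&lt;/p&gt;

```java
// Sketch of the connection rule: two tracers are linked when they are
// closer than a threshold that grows with how densely the user is playing.
// BASE and SCALE are made-up constants for illustration.
public class MeshLinker {
    static final double BASE = 40.0;  // minimum link distance when idle
    static final double SCALE = 15.0; // extra distance per note per second

    // Threshold grows with note density, so a busier groove fills in the mesh.
    public static double threshold(double notesPerSecond) {
        return BASE + SCALE * notesPerSecond;
    }

    public static boolean connected(double x1, double y1, double x2, double y2,
                                    double notesPerSecond) {
        double d = Math.hypot(x2 - x1, y2 - y1);
        return d < threshold(notesPerSecond);
    }
}
```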

&lt;p&gt;On top of this, I draw shapes that correspond to different pads throughout the kit, the part of the pad that is hit, and the velocity behind the note. I created the shapes by extending the &lt;code class=&quot;highlighter-rouge&quot;&gt;Polygon2D&lt;/code&gt; and &lt;code class=&quot;highlighter-rouge&quot;&gt;Ellipse&lt;/code&gt; classes from the &lt;a href=&quot;http://toxiclibs.org&quot;&gt;toxiclibs&lt;/a&gt; library, creating my own &lt;code class=&quot;highlighter-rouge&quot;&gt;DrumPolygon&lt;/code&gt;, &lt;code class=&quot;highlighter-rouge&quot;&gt;SnarePolygon&lt;/code&gt;, &lt;code class=&quot;highlighter-rouge&quot;&gt;DrumEllipse&lt;/code&gt;, and &lt;code class=&quot;highlighter-rouge&quot;&gt;CymbalEllipseStack&lt;/code&gt; classes. All of these classes visualize things in slightly different ways, but share a few characteristics. The size and color of each one, when it is hit, are set relative to the velocity of the note. The size approaches the maximum set size, and the color approaches the maximum set brightness using HSB. The size and brightness then gradually lower with each frame, shrinking the shape back to its initial state.&lt;/p&gt;
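
&lt;p&gt;The hit-and-decay behavior shared by these classes might look roughly like this; the constants and names are illustrative, and the real classes extend the toxiclibs shapes rather than standing alone.&lt;/p&gt;

```java
// Sketch of the shared hit/decay behavior: a hit pushes size and HSB
// brightness toward their maxima in proportion to velocity, and each
// frame they ease back toward the resting state.
public class HitShape {
    final float minSize;
    final float maxSize;
    final float maxBrightness; // HSB brightness ceiling
    final float decay;         // per-frame multiplier, e.g. 0.9

    float size;
    float brightness;

    public HitShape(float minSize, float maxSize, float maxBrightness, float decay) {
        this.minSize = minSize;
        this.maxSize = maxSize;
        this.maxBrightness = maxBrightness;
        this.decay = decay;
        this.size = minSize;
        this.brightness = 0f;
    }

    // velocity is the raw MIDI value, 0-127.
    public void hit(int velocity) {
        float v = velocity / 127f;
        size = minSize + (maxSize - minSize) * v;
        brightness = maxBrightness * v;
    }

    // Called once per frame to shrink the shape back to its initial state.
    public void update() {
        size = minSize + (size - minSize) * decay;
        brightness *= decay;
    }
}
```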

&lt;p&gt;Each shape represents the feeling behind the corresponding part of the kit. The polygons in the top corners represent the splashy, resonant nature of crashes. The ellipse on the bottom represents the dark, low tone that bubbles up and quickly dissipates. The snare in the middle deforms further and further as the time between hits increases, as the snare is often the centerpiece of any groove, the backbeat that everybody comes back to, and thus it becomes uncomfortable with long periods of nonuse. The combination of the mathematical mesh in the background and abstract shapes in the foreground represent the quantitative creativity that is percussion.&lt;/p&gt;

&lt;p&gt;Click on the image below to check out a demo of this piece:&lt;/p&gt;

&lt;p&gt;&lt;a href=&quot;http://www.youtube.com/watch?v=Rln4jtj6fJQ&quot; title=&quot;Percussive Flow Demo&quot;&gt;&lt;img src=&quot;http://img.youtube.com/vi/Rln4jtj6fJQ/0.jpg&quot; alt=&quot;Percussive Flow Demo&quot; /&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Explore the &lt;a href=&quot;https://github.com/stonemathers/percussive-flow&quot;&gt;source code&lt;/a&gt;!&lt;/p&gt;</content><author><name></name></author><summary type="html">Music visualizers generally fit into two categories: either they are rendered in advance for a specific song, or they react in real time to an audio input. For the latter, they often struggle to accurately convey the emotion of a song beyond its tempo/pulse and volume. This is especially true of a song’s percussion, which is often stripped down to big bass and snare hits. For this piece, I hope to better represent the feelings behind each component of a drumkit, as well as the emotions behind certain styles of playing. It is also important that the piece reacts in real time, as this allows the user to visualize the music as they play it, adjusting their playing as they explore the various emotions that the piece can produce.</summary></entry><entry><title type="html">Final Project Proposal</title><link href="/iss294-portfolio/blogs/2019/04/17/final-project-proposal.html" rel="alternate" type="text/html" title="Final Project Proposal" /><published>2019-04-17T00:00:00+00:00</published><updated>2019-04-17T00:00:00+00:00</updated><id>/iss294-portfolio/blogs/2019/04/17/final-project-proposal</id><content type="html" xml:base="/iss294-portfolio/blogs/2019/04/17/final-project-proposal.html">&lt;h1 id=&quot;project-percussive-flow&quot;&gt;Project: Percussive Flow&lt;/h1&gt;

&lt;h3 id=&quot;conceptual-description&quot;&gt;Conceptual Description&lt;/h3&gt;

&lt;p&gt;Music visualizers generally fit into two categories: either they are rendered in advance for a specific song, or they react in real time to an audio input. For the latter, they often struggle to accurately convey the emotion of a song beyond its tempo/pulse and volume. This is especially true of a song’s percussion, which is often stripped down to big bass and snare hits. For this piece, I hope to better represent the feelings behind each component of a drumkit, as well as the emotions behind certain styles of playing. It is also important that the piece reacts in real time, as this allows the user to visualize the music as they play it, adjusting their playing as they explore the various emotions that the piece can produce.&lt;/p&gt;

&lt;h3 id=&quot;interaction-description&quot;&gt;Interaction Description&lt;/h3&gt;

&lt;p&gt;The user will interact with the piece by playing a TD-17KV electronic drumkit, which is connected to the laptop displaying the visualization. If the goal is to allow the audience to see the visualization, such as when I am playing it as a demo, then the visualization will be facing them. Otherwise, if the user is the one exploring the visualization, then the screen will be facing them. Ideally, the laptop will always be facing the user, while also being connected to a projector facing the audience. The intended audience is absolutely anybody who enjoys music or simply hitting things. The piece is incredibly simple to interact with and requires no drumming skill to manipulate. However, those who do play may gain a greater appreciation for the emotions conveyed and an ability to create even more complex visualizations. Interaction drives the creation of the visualization, as there would be no emotion conveyed without it. Audience interaction and creative expression are themselves the idea behind the visualization they create.&lt;/p&gt;

&lt;h3 id=&quot;technical-details&quot;&gt;Technical Details&lt;/h3&gt;

&lt;p&gt;This visualization is built entirely in Processing. I use the Midibus Processing library to interpret the MIDI data received from the TD-17 drum module. Each MIDI note includes a &lt;code class=&quot;highlighter-rouge&quot;&gt;pitch&lt;/code&gt; value and a &lt;code class=&quot;highlighter-rouge&quot;&gt;velocity&lt;/code&gt; value. Every pad on the e-kit has a unique &lt;code class=&quot;highlighter-rouge&quot;&gt;pitch&lt;/code&gt;, such as 36 for the bass drum or 40 for a snare rim shot. The &lt;code class=&quot;highlighter-rouge&quot;&gt;velocity&lt;/code&gt; simply measures how hard the pad was hit on a scale of 0-127. For the visualization itself, I plan to explore a variety of Processing libraries as I cycle through the many ideas floating around my head. I have all but settled on using Tracer to create a mesh background that fills in as the note density increases. For the shapes that are produced by each pad, I plan to look at various geometry libraries such as ComputationalGeometry, Culebra, and Toxiclibs.js.&lt;/p&gt;

&lt;p&gt;Because this piece depends on having a TD-17 drum module connected to the computer running it, I don’t plan to host it online anywhere. Instead, I will create a well-documented GitHub repository for anyone who wants to download it to their own machine and/or extend my work.&lt;/p&gt;

&lt;p&gt;The code itself is structured very simply. With Midibus, I can quickly set up an object that takes in MIDI input from “TD-17” and does not send output anywhere:&lt;/p&gt;

&lt;div class=&quot;language-javascript highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;&lt;span class=&quot;nx&quot;&gt;MidiBus&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;myBus&lt;/span&gt; &lt;span class=&quot;o&quot;&gt;=&lt;/span&gt; &lt;span class=&quot;k&quot;&gt;new&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;MidiBus&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;(&lt;/span&gt;&lt;span class=&quot;k&quot;&gt;this&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;s2&quot;&gt;&quot;TD-17&quot;&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;o&quot;&gt;-&lt;/span&gt;&lt;span class=&quot;mi&quot;&gt;1&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;Then, every time the MidiBus receives a new MIDI note, it calls the function &lt;code class=&quot;highlighter-rouge&quot;&gt;noteOn()&lt;/code&gt;. Within this function I set &lt;code class=&quot;highlighter-rouge&quot;&gt;lastPitch&lt;/code&gt; and &lt;code class=&quot;highlighter-rouge&quot;&gt;lastVelocity&lt;/code&gt; to be used in the next &lt;code class=&quot;highlighter-rouge&quot;&gt;draw()&lt;/code&gt; call:&lt;/p&gt;

&lt;div class=&quot;language-javascript highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;&lt;span class=&quot;k&quot;&gt;void&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;noteOn&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;(&lt;/span&gt;&lt;span class=&quot;kr&quot;&gt;int&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;channel&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;kr&quot;&gt;int&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;pitch&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;kr&quot;&gt;int&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;velocity&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;)&lt;/span&gt; &lt;span class=&quot;p&quot;&gt;{&lt;/span&gt;
  &lt;span class=&quot;nx&quot;&gt;lastPitch&lt;/span&gt; &lt;span class=&quot;o&quot;&gt;=&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;pitch&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;;&lt;/span&gt;
  &lt;span class=&quot;nx&quot;&gt;lastVelocity&lt;/span&gt; &lt;span class=&quot;o&quot;&gt;=&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;velocity&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;;&lt;/span&gt;
&lt;span class=&quot;p&quot;&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;Lastly, I use the &lt;code class=&quot;highlighter-rouge&quot;&gt;draw()&lt;/code&gt; function to update the visualization. This could be based on what was last played, how loudly it was played, how many notes have been played in a certain amount of time, whether a pattern is detected, or any number of other variables. In a basic example, I draw a centered square whose color is based on the pad that was last hit and whose size is based on the velocity of the last hit:&lt;/p&gt;

&lt;div class=&quot;language-javascript highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;&lt;span class=&quot;k&quot;&gt;void&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;draw&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;()&lt;/span&gt; &lt;span class=&quot;p&quot;&gt;{&lt;/span&gt;
  &lt;span class=&quot;kr&quot;&gt;float&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;r&lt;/span&gt; &lt;span class=&quot;o&quot;&gt;=&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;map&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;(&lt;/span&gt;&lt;span class=&quot;nx&quot;&gt;lastPitch&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;mi&quot;&gt;22&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;mi&quot;&gt;59&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;mi&quot;&gt;0&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;mi&quot;&gt;255&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;);&lt;/span&gt;
  &lt;span class=&quot;kr&quot;&gt;float&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;g&lt;/span&gt; &lt;span class=&quot;o&quot;&gt;=&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;map&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;(&lt;/span&gt;&lt;span class=&quot;nx&quot;&gt;lastPitch&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;mi&quot;&gt;22&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;mi&quot;&gt;59&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;mi&quot;&gt;0&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;mi&quot;&gt;50&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;);&lt;/span&gt;
  &lt;span class=&quot;kr&quot;&gt;float&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;b&lt;/span&gt; &lt;span class=&quot;o&quot;&gt;=&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;map&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;(&lt;/span&gt;&lt;span class=&quot;nx&quot;&gt;lastPitch&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;mi&quot;&gt;22&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;mi&quot;&gt;59&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;mi&quot;&gt;0&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;mi&quot;&gt;255&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;);&lt;/span&gt;
  &lt;span class=&quot;kr&quot;&gt;float&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;w&lt;/span&gt; &lt;span class=&quot;o&quot;&gt;=&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;map&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;(&lt;/span&gt;&lt;span class=&quot;nx&quot;&gt;lastVelocity&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;mi&quot;&gt;0&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;mi&quot;&gt;127&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;mi&quot;&gt;0&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;width&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;);&lt;/span&gt;
  &lt;span class=&quot;kr&quot;&gt;float&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;h&lt;/span&gt; &lt;span class=&quot;o&quot;&gt;=&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;map&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;(&lt;/span&gt;&lt;span class=&quot;nx&quot;&gt;lastVelocity&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;mi&quot;&gt;0&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;mi&quot;&gt;127&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;mi&quot;&gt;0&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;height&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;);&lt;/span&gt;
  &lt;span class=&quot;nx&quot;&gt;fill&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;(&lt;/span&gt;&lt;span class=&quot;nx&quot;&gt;r&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;g&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;b&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;);&lt;/span&gt;
  &lt;span class=&quot;nx&quot;&gt;rect&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;(&lt;/span&gt;&lt;span class=&quot;nx&quot;&gt;width&lt;/span&gt;&lt;span class=&quot;o&quot;&gt;/&lt;/span&gt;&lt;span class=&quot;mi&quot;&gt;2&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;height&lt;/span&gt;&lt;span class=&quot;o&quot;&gt;/&lt;/span&gt;&lt;span class=&quot;mi&quot;&gt;2&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;w&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;h&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;);&lt;/span&gt;
&lt;span class=&quot;p&quot;&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;In another basic example, I move a square around the screen based on the pitch and velocity of the last hit:&lt;/p&gt;

&lt;div class=&quot;language-javascript highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;&lt;span class=&quot;k&quot;&gt;void&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;draw&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;()&lt;/span&gt; &lt;span class=&quot;p&quot;&gt;{&lt;/span&gt;
  &lt;span class=&quot;nx&quot;&gt;background&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;(&lt;/span&gt;&lt;span class=&quot;mi&quot;&gt;0&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;);&lt;/span&gt;
  &lt;span class=&quot;kr&quot;&gt;float&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;r&lt;/span&gt; &lt;span class=&quot;o&quot;&gt;=&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;map&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;(&lt;/span&gt;&lt;span class=&quot;nx&quot;&gt;lastPitch&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;mi&quot;&gt;22&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;mi&quot;&gt;59&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;mi&quot;&gt;0&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;mi&quot;&gt;255&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;);&lt;/span&gt;
  &lt;span class=&quot;kr&quot;&gt;float&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;g&lt;/span&gt; &lt;span class=&quot;o&quot;&gt;=&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;map&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;(&lt;/span&gt;&lt;span class=&quot;nx&quot;&gt;lastPitch&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;mi&quot;&gt;22&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;mi&quot;&gt;59&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;mi&quot;&gt;0&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;mi&quot;&gt;50&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;);&lt;/span&gt;
  &lt;span class=&quot;kr&quot;&gt;float&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;b&lt;/span&gt; &lt;span class=&quot;o&quot;&gt;=&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;map&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;(&lt;/span&gt;&lt;span class=&quot;nx&quot;&gt;lastPitch&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;mi&quot;&gt;22&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;mi&quot;&gt;59&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;mi&quot;&gt;0&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;mi&quot;&gt;255&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;);&lt;/span&gt;
  &lt;span class=&quot;kr&quot;&gt;float&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;x&lt;/span&gt; &lt;span class=&quot;o&quot;&gt;=&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;map&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;(&lt;/span&gt;&lt;span class=&quot;nx&quot;&gt;lastPitch&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;mi&quot;&gt;22&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;mi&quot;&gt;59&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;mi&quot;&gt;0&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;width&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;);&lt;/span&gt;
  &lt;span class=&quot;kr&quot;&gt;float&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;y&lt;/span&gt; &lt;span class=&quot;o&quot;&gt;=&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;map&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;(&lt;/span&gt;&lt;span class=&quot;nx&quot;&gt;lastVelocity&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;mi&quot;&gt;0&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;mi&quot;&gt;127&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;mi&quot;&gt;0&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;height&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;);&lt;/span&gt;
  &lt;span class=&quot;nx&quot;&gt;fill&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;(&lt;/span&gt;&lt;span class=&quot;nx&quot;&gt;r&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;g&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;b&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;);&lt;/span&gt;
  &lt;span class=&quot;nx&quot;&gt;rect&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;(&lt;/span&gt;&lt;span class=&quot;nx&quot;&gt;x&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;nx&quot;&gt;y&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;mi&quot;&gt;80&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;mi&quot;&gt;80&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;);&lt;/span&gt;
&lt;span class=&quot;p&quot;&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;All of the code for this project will be held in my github repository &lt;a href=&quot;https://github.com/stonemathers/percussive-flow&quot;&gt;stonemathers/percussive-flow&lt;/a&gt;.&lt;/p&gt;</content><author><name></name></author><summary type="html">Project: Percussive Flow</summary></entry><entry><title type="html">Blog Post - Cybernetics</title><link href="/iss294-portfolio/blogs/2019/04/11/blog-post-cybernetics.html" rel="alternate" type="text/html" title="Blog Post - Cybernetics" /><published>2019-04-11T00:00:00+00:00</published><updated>2019-04-11T00:00:00+00:00</updated><id>/iss294-portfolio/blogs/2019/04/11/blog-post-cybernetics</id><content type="html" xml:base="/iss294-portfolio/blogs/2019/04/11/blog-post-cybernetics.html">&lt;h1 id=&quot;blog-post---cybernetics&quot;&gt;Blog Post - Cybernetics&lt;/h1&gt;

&lt;h3 id=&quot;surface&quot;&gt;Surface&lt;/h3&gt;

&lt;p&gt;&lt;img src=&quot;/iss294-portfolio/images/surface.jpg&quot; alt=&quot;Surface&quot; /&gt;&lt;/p&gt;

&lt;p&gt;Surface was created by cybernetics artist &lt;a href=&quot;https://www.drake-brockman.com.au/index.html&quot;&gt;Geoffrey Drake-Brockman&lt;/a&gt;, who specializes in large-scale public installations. This piece, which is on permanent display in Perth Children’s Hospital, consists of over 2,000 ceiling-mounted LED tubes. This “pond” of lights mimics the motion of water through the changing of color and “flowing” of light along the length of each LED tube. When one of four sensors detects a pedestrian moving below, a virtual stone is thrown into the pond, causing ripples to disrupt the previously calm scene.&lt;/p&gt;

&lt;p&gt;I was drawn to this piece by the simplicity of its concept, yet complexity of its output. Additionally, it is visually stunning and soothing all at once. I could easily picture myself walking back and forth at different speeds, seeing how much I could disrupt the digital pool. I could just as easily sit there watching it flow for an hour-long emergency room wait. When I visited the Renwick Gallery in Washington D.C. recently, there was a similar installation above the main staircase. However, that piece lacked the varied colors and user interaction, two components that bring Surface to a different level.&lt;/p&gt;

&lt;p&gt;&lt;a href=&quot;https://www.youtube.com/watch?v=hJW_Fi-r68Q&quot;&gt;Check out this video&lt;/a&gt; to see Surface in action.&lt;/p&gt;

&lt;h3 id=&quot;digital-wheel-art&quot;&gt;Digital Wheel Art&lt;/h3&gt;

&lt;p&gt;&lt;img src=&quot;/iss294-portfolio/images/dwa_usertest.jpg&quot; alt=&quot;Digital Wheel Art - User Test&quot; /&gt;&lt;/p&gt;

&lt;p&gt;After realizing that children diagnosed with cerebral palsy were restricted from creative expression, Younghyun Chung designed and built Digital Wheel Art. Using an infrared sensor and a Nintendo Wiimote, Chung’s installation detects a wheelchair’s movement around a room, projecting it on the screen as brush strokes. The user can also tilt their head to change the brush color, allowing for further means of artistic expression. By repurposing cheap, widespread technology, Chung has unlocked a new world of communication through art for people with a wide range of physical disabilities.&lt;/p&gt;

&lt;p&gt;Read more about Digital Wheel Art and watch videos of user tests &lt;a href=&quot;http://risknfun.com/project/digitalwheelart/&quot;&gt;here&lt;/a&gt;.&lt;/p&gt;</content><author><name></name></author><summary type="html">Blog Post - Cybernetics</summary></entry><entry><title type="html">Smoky Mountain Relay Visualizer</title><link href="/iss294-portfolio/pieces/2019/03/26/smoky-mountain-relay-visualizer.html" rel="alternate" type="text/html" title="Smoky Mountain Relay Visualizer" /><published>2019-03-26T00:00:00+00:00</published><updated>2019-03-26T00:00:00+00:00</updated><id>/iss294-portfolio/pieces/2019/03/26/smoky-mountain-relay-visualizer</id><content type="html" xml:base="/iss294-portfolio/pieces/2019/03/26/smoky-mountain-relay-visualizer.html">&lt;p&gt;At the end of April, I will be running the Smoky Mountain Relay with seven other members of Duke Club Running. This 140-mile relay race is comprised of 24 legs that wind throughout Pisgah National Forest. While there are lists of the leg distances, elevation changes, and relative difficulties, I found it hard to wrap my head around the progression of the race and how the legs truly compared to one another. So, I sought to create a playful, yet useful visualization of the race that allows the user to virtually “run” the race. Hopefully this would help the user better understand the difference in difficulty between legs and overall portions of the race, thus allowing for a more informed division of legs between team members. I also hoped to convey just how much of the race takes place during the night.&lt;/p&gt;

&lt;p&gt;&lt;img src=&quot;/iss294-portfolio/images/race-start.png&quot; alt=&quot;Visualization Start&quot; /&gt;&lt;/p&gt;

&lt;p&gt;Using the distance, start elevation, end elevation, and difficulty measurements provided on the race website, I created a JSON file containing data for all of the legs. Using small graphs provided with the map for each leg, I also estimated the position and elevation of peaks and valleys, thus allowing me to more accurately portray the terrain. These legs are drawn side-by-side, scaled down to fit on the screen. The vertical axis on the left of the screen is dynamically created to fit the range of elevations provided by the JSON file. The horizontal axis on the bottom is similarly created according to the total distance of the race. Therefore, the legs can be updated for future years without requiring new code. The difficulty gauge in the bottom left and the colors used to portray each leg’s difficulty are also created dynamically based on the range of difficulties.&lt;/p&gt;
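
&lt;p&gt;The dynamic axis scaling can be sketched as follows; the class and its names are hypothetical, with Processing’s &lt;code class=&quot;highlighter-rouge&quot;&gt;map()&lt;/code&gt; written out by hand.&lt;/p&gt;

```java
// Sketch of scaling a leg's elevation to screen space, so the vertical
// axis adapts to whatever range the JSON data provides. The class and
// field names are illustrative, not from the actual sketch.
public class AxisScaler {
    final double minElev, maxElev; // derived from the legs in the JSON file
    final double screenHeight;

    public AxisScaler(double minElev, double maxElev, double screenHeight) {
        this.minElev = minElev;
        this.maxElev = maxElev;
        this.screenHeight = screenHeight;
    }

    // Higher elevation maps closer to the top of the screen (smaller y).
    public double elevationToY(double elev) {
        double t = (elev - minElev) / (maxElev - minElev);
        return screenHeight * (1.0 - t);
    }
}
```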

&lt;p&gt;The user can either use the left and right arrow keys or scroll up and down to traverse through the visualization. As the user progresses along the race, the sky progressively darkens until it is black, remaining so for the approximate portion of the race taking place at night. As the user approaches the end, the sky brightens once again with the rising of the sun. There is also a progress bar at the bottom of the screen to provide positional context within the scope of the entire race.&lt;/p&gt;
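
&lt;p&gt;The darkening and brightening of the sky can be sketched as a piecewise function of race progress; the dusk and dawn cutoffs below are illustrative, not the values used in the visualization.&lt;/p&gt;

```java
// Sketch of the day-night-day sky cycle, keyed to progress through the
// race (0.0 = start, 1.0 = finish). The cutoff constants are made up.
public class Sky {
    static final double DUSK_END = 0.3;   // fully dark by 30% of the race
    static final double DAWN_START = 0.8; // sun starts rising at 80%

    // Returns sky brightness from 0.0 (night) to 1.0 (full daylight).
    public static double brightness(double progress) {
        if (progress < DUSK_END) {
            return 1.0 - progress / DUSK_END;                    // fading to dark
        } else if (progress < DAWN_START) {
            return 0.0;                                          // night
        } else {
            return (progress - DAWN_START) / (1.0 - DAWN_START); // sunrise
        }
    }
}
```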

&lt;p&gt;&lt;img src=&quot;/iss294-portfolio/images/race-middle.png&quot; alt=&quot;Visualization Middle&quot; /&gt;&lt;/p&gt;

&lt;p&gt;&lt;img src=&quot;/iss294-portfolio/images/race-end.png&quot; alt=&quot;Visualization End&quot; /&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href=&quot;https://stonemathers.github.io/smr-legs/&quot;&gt;Explore this visualization&lt;/a&gt; or &lt;a href=&quot;https://github.com/stonemathers/smr-legs&quot;&gt;peruse the source code&lt;/a&gt;!&lt;/p&gt;</content><author><name></name></author><summary type="html">At the end of April, I will be running the Smoky Mountain Relay with seven other members of Duke Club Running. This 140-mile relay race is comprised of 24 legs that wind throughout Pisgah National Forest. While there are lists of the leg distances, elevation changes, and relative difficulties, I found it hard to wrap my head around the progression of the race and how the legs truly compared to one another. So, I sought to create a playful, yet useful visualization of the race that allows the user to virtually “run” the race. Hopefully this would help the user better understand the difference in difficulty between legs and overall portions of the race, thus allowing for a more informed division of legs between team members. I also hoped to convey just how much of the race takes place during the night.</summary></entry><entry><title type="html">Blog Post - Data Visualization</title><link href="/iss294-portfolio/blogs/2019/02/28/blog-post-data-visualization.html" rel="alternate" type="text/html" title="Blog Post - Data Visualization" /><published>2019-02-28T00:00:00+00:00</published><updated>2019-02-28T00:00:00+00:00</updated><id>/iss294-portfolio/blogs/2019/02/28/blog-post-data-visualization</id><content type="html" xml:base="/iss294-portfolio/blogs/2019/02/28/blog-post-data-visualization.html">&lt;h2 id=&quot;creative-routines&quot;&gt;Creative Routines&lt;/h2&gt;

&lt;p&gt;&lt;img src=&quot;/iss294-portfolio/images/creative-routines.png&quot; alt=&quot;Creative Routines Graphic&quot; /&gt;&lt;/p&gt;

&lt;p&gt;Designed by RJ Andrews, this piece visualizes the daily routines of sixteen great creative minds from the 18th, 19th, and 20th centuries. Its success starts with its use of interesting and unique data. It provides a perspective on important figures that is rarely discussed, let alone compiled into a digestible format. Clarity is its second strength: it is easy to gather a general understanding of the data with a quick glance. Activities are broken up into a few general categories - enough to give a good idea of how time was being spent, without becoming too specific. For greater depth, however, descriptions are provided next to nearly every block of time. These layers are accessible, but do not get in the way of the bigger picture, allowing the viewer to slowly work their way to a more detailed understanding of the visualization.&lt;/p&gt;

&lt;p&gt;You can read more about the Creative Routines visualization &lt;a href=&quot;https://infowetrust.com/creative-routines/&quot;&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;h2 id=&quot;speed-comparison-chart&quot;&gt;Speed Comparison Chart&lt;/h2&gt;

&lt;p&gt;&lt;img src=&quot;/iss294-portfolio/images/speeds.jpeg&quot; alt=&quot;Speed Comparison Chart&quot; /&gt;&lt;/p&gt;

&lt;p&gt;The sleek color scheme originally drew me to Jim Kynvin’s Speed Comparison Chart. The dark background allows the graphic’s primary information to truly pop, while also concealing further secondary information. This prevents the chart from becoming cluttered, while still providing more data for the viewer to discover. I personally found the use of triangles, instead of the typical rectangles, to convey relative speeds to be highly effective. It accentuates the high-velocity nature of the data, giving the illusion that the depicted vehicles (and one bird) are flying from the base at progressively faster speeds. The progression from dark to warm colors as the speeds increase creates a similar effect.&lt;/p&gt;

&lt;p&gt;See more of Jim’s visualizations &lt;a href=&quot;http://isotype.co.uk&quot;&gt;here&lt;/a&gt;.&lt;/p&gt;</content><author><name></name></author><summary type="html">Creative Routines</summary></entry><entry><title type="html">Algorithmic Fractals</title><link href="/iss294-portfolio/pieces/2019/02/19/algorithmic-fractals.html" rel="alternate" type="text/html" title="Algorithmic Fractals" /><published>2019-02-19T00:00:00+00:00</published><updated>2019-02-19T00:00:00+00:00</updated><id>/iss294-portfolio/pieces/2019/02/19/algorithmic-fractals</id><content type="html" xml:base="/iss294-portfolio/pieces/2019/02/19/algorithmic-fractals.html">&lt;p&gt;&lt;img src=&quot;/iss294-portfolio/images/fractals1.png&quot; alt=&quot;Fractals Piece 1&quot; /&gt;&lt;/p&gt;

&lt;p&gt;The idea for this piece was inspired by LIA’s Tentasho, which I discuss in &lt;a href=&quot;/blogs/2019/01/22/blog-post-new-media-artists-ii.html&quot;&gt;Blog Post - New Media Artists II&lt;/a&gt;. Of course, my version is quite simplified, but it still uses the interplay of a flowing brush and emerging fractals. Unlike LIA’s piece, I wanted to add a level of interactivity with the fractals once they are created. In addition, I play with color blend properties to allow users to easily create intricate and stunning visuals. At any point, the user can call the piece finished or, with a few simple inputs, create an entirely different one.&lt;/p&gt;

&lt;p&gt;&lt;img src=&quot;/iss294-portfolio/images/fractals2.png&quot; alt=&quot;Fractals Piece 2&quot; /&gt;&lt;/p&gt;

&lt;p&gt;To interact with this piece, click and drag in empty space to begin drawing. On release, a fractal is drawn. This fractal can then be used as a “brush”, either dragging it around the screen, or rotating it by holding the ‘a’ key or space bar while dragging. The ‘a’ key will slowly rotate the fractal, while the space bar will cause a much more sporadic rotation. The ‘r’ key can be pressed to reset the canvas.&lt;/p&gt;
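&lt;p&gt;To give a flavor of how a released stroke might sprout a fractal, here is a hypothetical recursive branching sketch in plain JavaScript (the real piece’s rules are its own; the angles, shrink factor, and names here are purely illustrative):&lt;/p&gt;

```javascript
// Hypothetical sketch: recursively grow branch segments from a point,
// such as the end of a brush stroke. Collects the start point of every
// segment; drawing lines between parents and children renders the fractal.
function branch(x, y, angle, length, depth, points) {
  points.push([x, y]); // record the start of this branch segment
  if (depth === 0) return points;
  // End point of the current segment
  const nx = x + length * Math.cos(angle);
  const ny = y + length * Math.sin(angle);
  // Two shorter child branches fan out from the end point
  branch(nx, ny, angle - Math.PI / 6, length * 0.7, depth - 1, points);
  branch(nx, ny, angle + Math.PI / 6, length * 0.7, depth - 1, points);
  return points;
}
```

&lt;p&gt;Rotating the finished fractal would then amount to adding a per-frame offset to the initial angle before redrawing - tiny and steady for a slow rotation, random for a sporadic one.&lt;/p&gt;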

&lt;p&gt;Try it out &lt;a href=&quot;https://stonemathers.github.io/iss294-algorithms/&quot;&gt;here&lt;/a&gt; or give the &lt;a href=&quot;https://github.com/stonemathers/iss294-algorithms&quot;&gt;source code&lt;/a&gt; a look!&lt;/p&gt;</content><author><name></name></author><summary type="html"></summary></entry><entry><title type="html">Blog Post - Algorithmic Art</title><link href="/iss294-portfolio/blogs/2019/02/07/blog-post-algorithmic-art.html" rel="alternate" type="text/html" title="Blog Post - Algorithmic Art" /><published>2019-02-07T00:00:00+00:00</published><updated>2019-02-07T00:00:00+00:00</updated><id>/iss294-portfolio/blogs/2019/02/07/blog-post-algorithmic-art</id><content type="html" xml:base="/iss294-portfolio/blogs/2019/02/07/blog-post-algorithmic-art.html">&lt;h2 id=&quot;the-cave-of-rebirth&quot;&gt;The Cave of Rebirth&lt;/h2&gt;

&lt;p&gt;&lt;img src=&quot;/iss294-portfolio/images/CaveOfRebirth_Piano.png&quot; alt=&quot;Cave of Rebirth with Piano&quot; /&gt;&lt;/p&gt;

&lt;p&gt;The Cave of Rebirth is a song written and performed by Armenian jazz pianist Tigran Hamasyan. For the accompanying music video, visual artist Julius Horsthuis created a series of algorithmically generated and animated 3D fractals. This natural, yet otherworldly aesthetic perfectly accompanies the trance-like music of Hamasyan, which itself sounds ethereal but human. I was drawn to this piece because, while the visuals do follow the fluctuating emotions of the music, they were clearly not precisely planned. While certain parameters can be entered to guide the visuals towards a certain feeling, the process is still an act of exploration and discovery.&lt;/p&gt;

&lt;p&gt;&lt;img src=&quot;/iss294-portfolio/images/CaveOfRebirth.png&quot; alt=&quot;Cave of Rebirth&quot; /&gt;&lt;/p&gt;

&lt;p&gt;You can read more about the motivations behind the work &lt;a href=&quot;https://www.vice.com/en_us/article/53q75k/tigran-hamasyan-solo-jazz-fractal-universe-music-video&quot;&gt;here&lt;/a&gt;, but to truly appreciate it you have to &lt;a href=&quot;https://www.youtube.com/watch?v=KtMDfBPghgE&quot;&gt;see it in motion&lt;/a&gt;.&lt;/p&gt;

&lt;h2 id=&quot;helen-alexandra&quot;&gt;Helen Alexandra&lt;/h2&gt;

&lt;p&gt;&lt;img src=&quot;/iss294-portfolio/images/alexandra_acrylic.jpg&quot; alt=&quot;Helen Alexandra - Induction Lodge&quot; /&gt;&lt;/p&gt;

&lt;p&gt;Helen Alexandra uses custom generative software and traditional physical media to blur the lines between digital and physical art. Helen’s software renders animations in real time, which she captures at particularly moving moments. She then takes this still and digitally paints over it, blending the algorithmic and human-generated aspects. Finally, she prints the result on paper or canvas and draws on it with acrylic or ink. Helen’s process creates a truly unique look, as it combines features from three different genres into one of its own.&lt;/p&gt;

&lt;p&gt;&lt;img src=&quot;/iss294-portfolio/images/alexandra_ink.jpg&quot; alt=&quot;Helen Alexandra Ink&quot; /&gt;&lt;/p&gt;

&lt;p&gt;Read more about Helen’s work &lt;a href=&quot;https://www.vice.com/en_us/article/78e5qb/generative-paintings-acrylic-algorithms-helen-alexandra&quot;&gt;here&lt;/a&gt;.&lt;/p&gt;</content><author><name></name></author><summary type="html">The Cave of Rebirth</summary></entry><entry><title type="html">#typeaway</title><link href="/iss294-portfolio/pieces/2019/02/02/typeaway.html" rel="alternate" type="text/html" title="#typeaway" /><published>2019-02-02T00:00:00+00:00</published><updated>2019-02-02T00:00:00+00:00</updated><id>/iss294-portfolio/pieces/2019/02/02/typeaway</id><content type="html" xml:base="/iss294-portfolio/pieces/2019/02/02/typeaway.html">&lt;p&gt;&lt;img src=&quot;/iss294-portfolio/images/typeaway-title.png&quot; alt=&quot;Title Screen&quot; /&gt;&lt;/p&gt;

&lt;p&gt;For this piece, I wanted to maintain the heavily interactive component of my first sketch. I felt that the mouse movement of the first sketch made interaction feel less natural, so I removed that aspect, instead positioning letters randomly. I wanted to place more emphasis on the idea of streams of information flying across the screen, so I display the letters in a line. To maintain the feelings of confusion and disorientation from the first sketch, I randomized the size, speed, color, direction, and font of the displayed letters. Even the font of the title screen (added to hint at the piece’s necessary interaction) is randomized with each page load. I wanted a diverse set of fonts that mimic the different sources of information flooding our screens daily, with some of them reminiscent of pop culture such as video games, comic books, and samurai movies. For the colors, I decided to no longer entirely randomize them. Instead, each letter gets one color that it maintains as it slowly fades out. This fixed the potentially seizure-inducing aesthetic of the first sketch, but kept bright colors that stand out against the black landscape, void of information.&lt;/p&gt;
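&lt;p&gt;The fixed-color, fade-out behavior can be sketched as a simple letter “particle” whose traits are rolled once at creation, with only its opacity changing afterwards (a hypothetical plain-JavaScript illustration; every name, font, and range is illustrative, not the piece’s actual code):&lt;/p&gt;

```javascript
// Hypothetical fonts standing in for the diverse set described above
const FONTS = ['Courier New', 'Georgia', 'Impact', 'Comic Sans MS'];

function makeLetter(char, canvasWidth, canvasHeight) {
  const rightward = Math.random() > 0.5;          // random direction
  return {
    char,
    x: rightward ? 0 : canvasWidth,               // enter from either edge
    y: Math.random() * canvasHeight,              // random line to travel on
    size: 12 + Math.random() * 60,                // random size
    speed: (rightward ? 1 : -1) * (2 + Math.random() * 8),
    font: FONTS[Math.floor(Math.random() * FONTS.length)],
    hue: Math.floor(Math.random() * 360),         // one color, kept for life
    alpha: 255,
  };
}

function step(letter) {
  letter.x += letter.speed;                       // fly across in a straight line
  letter.alpha = Math.max(0, letter.alpha - 3);   // slowly fade out
  return letter;
}
```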

&lt;p&gt;&lt;img src=&quot;/iss294-portfolio/images/typeaway-busy.png&quot; alt=&quot;Busy Screen&quot; /&gt;&lt;/p&gt;

&lt;p&gt;Try this sketch out &lt;a href=&quot;https://stonemathers.github.io/iss294-helloworld2/&quot;&gt;here&lt;/a&gt; or check out the source code &lt;a href=&quot;https://github.com/stonemathers/iss294-helloworld2&quot;&gt;here&lt;/a&gt;!&lt;/p&gt;</content><author><name></name></author><summary type="html"></summary></entry><entry><title type="html">Blog Post - Sketch Reflection</title><link href="/iss294-portfolio/blogs/2019/01/29/blog-post-sketch-reflection.html" rel="alternate" type="text/html" title="Blog Post - Sketch Reflection" /><published>2019-01-29T00:00:00+00:00</published><updated>2019-01-29T00:00:00+00:00</updated><id>/iss294-portfolio/blogs/2019/01/29/blog-post-sketch-reflection</id><content type="html" xml:base="/iss294-portfolio/blogs/2019/01/29/blog-post-sketch-reflection.html">&lt;h2 id=&quot;reflection&quot;&gt;Reflection&lt;/h2&gt;
&lt;p&gt;My sketch, while approached as a fun and striking interactive piece, ended up representing some of the issues surrounding today’s dissemination of information online, such as information overload and an increasingly short news cycle. To do so, I used only letters, numbers, and symbols that are typed by the user. For each frame, I randomly assign a color to every character drawn on the screen to help emphasize the hectic and overwhelming nature of information online. This piece is intended for anybody who uses the internet, especially social media, as their primary source for news and information. I was likely driven to the fun and interactive side of this piece by a past of playing games online, many of which were meant to be visually overwhelming. The underlying messages sprouted both from my own history of using social media for news and from several courses that I have taken at Duke, such as Text Mining &amp;amp; Meaning, which have highlighted the inherent issues of information overload and internet sources.&lt;/p&gt;

&lt;h2 id=&quot;what-is-art&quot;&gt;What is Art?&lt;/h2&gt;
&lt;p&gt;The level of tedium and repetition involved in Wolfgang Laib’s work with pollen was particularly perplexing. Yet, when he explained his approach and reasoning for his work, I began to understand. It isn’t just the solitude that has drawn him to spend decades slowly collecting jars of pollen. For him, the repetition brings stability to a world in which change seems to be the only constant.&lt;/p&gt;

&lt;p&gt;In making my sketch, I related to the mindset of Hillary Lloyd. She did not seem to go into any of her works with a particular meaning or overarching goal in mind. She seemed more interested in making works for their own sake and in the uniqueness of each individual moment. Thus, she never edited her short films. Similarly, I began my piece without any particular goal or direction in mind.&lt;/p&gt;</content><author><name></name></author><summary type="html">Reflection My sketch, while approached as a fun and striking interactive piece, ended up representing some of the issues surrounding today’s dissemination of information online, such as information overload and an increasingly short news cycle. To do so, I used only letters, numbers, and symbols that are typed by the user. For each frame, I randomly assign a color to every character drawn on the screen to help emphasize the hectic and overwhelming nature of information online. This piece is intended for anybody who uses the internet, especially social media, as their primary source for news and information. I was likely driven to the fun and interactive side of this piece by a past of playing games online, many of which were meant to be visually overwhelming. The underlying messages sprouted from both a history of using social media for news myself and from several courses that I have taken at Duke, such as Text Mining &amp;amp; Meaning, which have highlighted the inherent issues of information overload and internet sources.</summary></entry><entry><title type="html">Hello World</title><link href="/iss294-portfolio/pieces/2019/01/27/hello-world.html" rel="alternate" type="text/html" title="Hello World" /><published>2019-01-27T00:00:00+00:00</published><updated>2019-01-27T00:00:00+00:00</updated><id>/iss294-portfolio/pieces/2019/01/27/hello-world</id><content type="html" xml:base="/iss294-portfolio/pieces/2019/01/27/hello-world.html">&lt;p&gt;This is my first sketch for ISS294 - Interactive Graphics. 
I first set out to play with text, animation, and randomization. I landed on this interactive and visually sporadic sketch. The user is meant to interact with the sketch by moving their mouse around the canvas and typing. Whatever letters, numbers, or symbols they type will then spring from the cursor, moving pseudo-randomly as they decay in size, randomly changing colors until they fade to nothing.&lt;/p&gt;
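&lt;p&gt;As a rough illustration, the decay and color behavior described above could be driven by a per-frame update along these lines (a hypothetical plain-JavaScript sketch; every name and constant is illustrative, not the actual source):&lt;/p&gt;

```javascript
// Hypothetical per-frame update for one typed character: its color is
// re-rolled every frame while its size decays toward nothing, and it
// drifts pseudo-randomly away from where it was typed.
function updateCharacter(ch) {
  ch.x += ch.vx + (Math.random() - 0.5) * 4; // pseudo-random drift
  ch.y += ch.vy + (Math.random() - 0.5) * 4;
  ch.size *= 0.96;                           // decay in size each frame
  ch.hue = Math.floor(Math.random() * 360);  // new random color every frame
  return ch.size > 0.5;                      // false once it has faded away
}
```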

&lt;p&gt;&lt;img src=&quot;/iss294-portfolio/images/helloworld1.png&quot; alt=&quot;Example 1&quot; /&gt;&lt;/p&gt;

&lt;p&gt;There are several ideas being portrayed by this sketch, all of which pertain to the dissemination of information across social media. With the speed at which information now travels, breaking news rapidly makes a large impact, but just as quickly fades out of the spotlight as the public moves to the next popular topic. Additionally, as information spreads, it often becomes distorted, either becoming twisted to fit a new perspective or losing its original meaning entirely. Lastly, with the ease of spreading user-generated content, important ideas can easily be lost amongst the overwhelming amount of information that is available.&lt;/p&gt;

&lt;p&gt;&lt;img src=&quot;/iss294-portfolio/images/helloworld2.png&quot; alt=&quot;Example 2&quot; /&gt;&lt;/p&gt;

&lt;p&gt;Check out the sketch &lt;a href=&quot;https://stonemathers.github.io/iss294-helloworld-ext/&quot;&gt;here&lt;/a&gt; or look at the &lt;a href=&quot;https://github.com/stonemathers/iss294-helloworld-ext&quot;&gt;source code&lt;/a&gt;!&lt;/p&gt;</content><author><name></name></author><summary type="html">This is my first sketch for ISS294 - Interactive Graphics. I first set out to play with text, animation, and randomization. I landed on this interactive and visually sporadic sketch. The user is meant to interact with the sketch by moving their mouse around the canvas and typing. Whatever letters, numbers, or symbols they type will then spring from the cursor, moving pseudo-randomly as they decay in size, randomly changing colors until they fade to nothing.</summary></entry></feed>