From: equilet <2237372+equilet@users.noreply.github.com>
Date: Sat, 31 Dec 2022 03:58:50 +0000 (-0800)
Subject: updated posts december 2022
X-Git-Url: https://git.dabkitsch.com/?a=commitdiff_plain;h=542b23257f082bbd71602e4d5b4b13847c7f4cd7;p=jekyll-site.git
updated posts december 2022
---
diff --git a/_posts/.DS_Store b/_posts/.DS_Store
index da7f287..26566bf 100644
Binary files a/_posts/.DS_Store and b/_posts/.DS_Store differ
diff --git a/_posts/academia/.DS_Store b/_posts/academia/.DS_Store
index 8cbd753..3e1e530 100644
Binary files a/_posts/academia/.DS_Store and b/_posts/academia/.DS_Store differ
diff --git a/_posts/academia/development/2018-04-24-cnmat-externals.md b/_posts/academia/development/2018-04-24-cnmat-externals.md
index 70fdab1..276262d 100644
--- a/_posts/academia/development/2018-04-24-cnmat-externals.md
+++ b/_posts/academia/development/2018-04-24-cnmat-externals.md
@@ -1,19 +1,17 @@
---
layout: project
-title: "CNMAT Externals Release"
+title: CNMAT Externals Release
date: 2018-04-24
-categories: academia development
+categories: software
---
-# CNMAT Externals Release
-
During my time at CNMAT, I updated and maintained the build system, documentation space, and supporting patchers for the CNMAT Externals library. This release included 64-bit Windows compatibility as well as a full documentation overhaul.

Background: The CNMAT externals are a set of externals for Max/MSP that represent various projects across CNMAT's research history. That history involved many toolchains beyond Max, but since much of the realtime synthesis/processing, performance-oriented software, and prototyping occurred within the Max environment, this set of objects is significant to understanding the history of R&D there.
-CNMAT announcement: "We are pleased to announce an update to our legacy CNMAT Externals package. This update includes a host of new features, including: - 64-bit compatibility - Updated help files and reference links - Many fixed bugs and updates (see log) - A new set of documentation reference files that accompany the help patchers Special thanks to Rama Gottfried and Jeff Lubow for all of their work on the 64-bit build process, testing, fixes and tweaks, documentation updates, and a host of upcoming features. Thanks also to Edmund Campion and Jeremy Wagner for their assistance in the testing phase. This work is made possible with support from the College of Letters and Science and the Department of Music at the University of California, Berkeley. The CNMAT Externals were conceived of / written by a number of researchers and staff throughout the years of CNMAT's history, including Adrian Freed, Matt Wright, John MacCallum, Rama Gottfried, Jeff Lubow, Andy Schmeder, David Wessel, Ben Jacobs and others."
+CNMAT announcement: `"We are pleased to announce an update to our legacy CNMAT Externals package. This update includes a host of new features, including: - 64-bit compatibility - Updated help files and reference links - Many fixed bugs and updates (see log) - A new set of documentation reference files that accompany the help patchers Special thanks to Rama Gottfried and Jeff Lubow for all of their work on the 64-bit build process, testing, fixes and tweaks, documentation updates, and a host of upcoming features. Thanks also to Edmund Campion and Jeremy Wagner for their assistance in the testing phase. This work is made possible with support from the College of Letters and Science and the Department of Music at the University of California, Berkeley. The CNMAT Externals were conceived of / written by a number of researchers and staff throughout the years of CNMAT's history, including Adrian Freed, Matt Wright, John MacCallum, Rama Gottfried, Jeff Lubow, Andy Schmeder, David Wessel, Ben Jacobs and others."`
[announcement link](https://cycling74.com/articles/content-you-need-cnmat-externals-update)
diff --git a/_posts/academia/pedagogy/2007-07-14-mmjss-ta.md b/_posts/academia/pedagogy/2007-07-14-mmjss-ta.md
index ac92d23..b765c35 100644
--- a/_posts/academia/pedagogy/2007-07-14-mmjss-ta.md
+++ b/_posts/academia/pedagogy/2007-07-14-mmjss-ta.md
@@ -1,18 +1,16 @@
---
layout: project
-title: "MMJ Summer School"
+title: "MMJ Summer School TA"
date: 2007-07-14
-categories: academia pedagogy
+categories: pedagogy
---
-# MMJ Day School w/ Michael Zbyszynski
-
-
-
----
+## MMJ Day School w/ Michael Zbyszynski
I assisted MZED with his course at CNMAT for the years of 2007, 2008, 2009, and 2010.
+
+
Topics included:
- Max Basics
diff --git a/_posts/academia/pedagogy/2011-07-18-mmjss.md b/_posts/academia/pedagogy/2011-07-18-mmjss.md
index 79ab828..743913e 100644
--- a/_posts/academia/pedagogy/2011-07-18-mmjss.md
+++ b/_posts/academia/pedagogy/2011-07-18-mmjss.md
@@ -1,16 +1,14 @@
---
-layout: project
-title: "MMJ Summer School"
+layout: default
+title: "MMJ Summer Course"
date: 2011-07-18
-categories: academia pedagogy
+categories: pedagogy
---
# MMJ Summer Course at CNMAT

----
-
I taught the Max/MSP/Jitter workshop with John MacCallum at CNMAT each year from 2011 through 2015.
The curriculum was revamped every year as we experimented with different pedagogical methods.
diff --git a/_posts/academia/pedagogy/2017-08-13-odot-immersion.md b/_posts/academia/pedagogy/2017-08-13-odot-immersion.md
index c114452..6d15a19 100644
--- a/_posts/academia/pedagogy/2017-08-13-odot-immersion.md
+++ b/_posts/academia/pedagogy/2017-08-13-odot-immersion.md
@@ -1,8 +1,8 @@
---
-layout: project
+layout: default
title: "o. Immersion Course"
date: 2017-08-13
-categories: academia pedagogy
+categories: pedagogy
---
# ODOT Immersion Course
@@ -11,7 +11,7 @@ This course guided students through an introduction into the odot (o.) programmi
Participants constructed sophisticated patches with an emphasis on interaction and exploration, while learning strategies and patterns for managing complexity, debugging, testing, and robustness. The course was tailored to artists and developers working with time-based media.
-
+
Background: odot provides a container, the "odot bundle", an extension of Open Sound Control (OSC), for naming, aggregating, and structuring data. In the Max programming environment, this is supported by a number of externals and patches for creating, manipulating, and displaying these structures. Leveraging this toolkit, students explored the design of complex, real-world patches, with an emphasis on real-time processing.
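The paragraph above describes the odot bundle as a container that names, aggregates, and structures data under OSC-style addresses. A minimal Python sketch of that core idea follows; this is illustrative only, not the real odot API, and the address names are hypothetical:

```python
# Sketch of the naming/aggregation idea behind an "odot bundle":
# nested named values flattened into OSC-style address/value pairs.
# (Hypothetical example data; not the actual odot externals or C API.)

def flatten(bundle, prefix=""):
    """Flatten a nested dict of names into OSC-style address/value pairs."""
    out = {}
    for name, value in bundle.items():
        address = prefix + "/" + name
        if isinstance(value, dict):
            # Sub-bundles extend the address hierarchy.
            out.update(flatten(value, address))
        else:
            out[address] = value
    return out

bundle = {"synth": {"freq": 440.0, "amp": 0.5}, "note": 69}
print(flatten(bundle))
# {'/synth/freq': 440.0, '/synth/amp': 0.5, '/note': 69}
```

The hierarchical addresses make it straightforward to aggregate, query, and route heterogeneous data through a patch, which is the structuring role odot plays in Max.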
diff --git a/_posts/academia/pedagogy/2018-01-17-cpma-year01.md b/_posts/academia/pedagogy/2018-01-17-cpma-year01.md
index ce4f0c1..a687d13 100644
--- a/_posts/academia/pedagogy/2018-01-17-cpma-year01.md
+++ b/_posts/academia/pedagogy/2018-01-17-cpma-year01.md
@@ -1,8 +1,8 @@
---
-layout: project
+layout: default
title: "Music 159: CPMA"
date: 2018-01-17
-categories: academia pedagogy
+categories: pedagogy
---
# Computer Programming for Music Applications
diff --git a/_posts/academia/research/2008-09-06-rgb-psm.md b/_posts/academia/research/2008-09-06-rgb-psm.md
index f3dfc33..bcb37d2 100644
--- a/_posts/academia/research/2008-09-06-rgb-psm.md
+++ b/_posts/academia/research/2008-09-06-rgb-psm.md
@@ -1,12 +1,11 @@
---
layout: project
-title: "RGB PS Monome"
+title: RGB pressure sensitive Monome
date: 2008-09-06
categories: research instrument
---
-# RGB pressure sensitive Monome
-
+---
One of my first projects at [CNMAT](https://cnmat.berkeley.edu) was a collaboration with Adrian Freed. Our goal was to build upon ideas present in the Monome, which is limited to on/off status for button interactions.
One way to circumvent this is to augment button toggles with pressure so that you simultaneously have pressure in one dimension, and a button with LED feedback in another. This idea is also captured in newer devices like the Keith McMillen Instruments QuNeo, which has a similar topology, but implemented with MIDI. Instead of leaving the colorspace at a single LED, we also wanted each LED to have a uniquely addressable/showable color. Essentially the project became an RGB pressure-sensitive Monome. Adrian had already worked out the tricky bits with resistive fabrics he had been experimenting with, and asked me to get the multiplexing and Arduino code set up. This project led to a research paper about the various techniques and affordances beyond a standard Monome device.
diff --git a/_posts/academia/research/2008-10-13-ssr.md b/_posts/academia/research/2008-10-13-ssr.md
index 4a44f04..1ad971c 100644
--- a/_posts/academia/research/2008-10-13-ssr.md
+++ b/_posts/academia/research/2008-10-13-ssr.md
@@ -1,13 +1,16 @@
---
-layout: job
+layout: project
title: "Subjective Spaces in Hearing"
-date: 2008-10-13
-categories: research hearing-science
+categories: research
+images:
+ - path: /assets/starkey02.jpg
+ title: starkey 02
+ - path: /assets/starkey01.png
+ title: starkey 01
---
-# CNMAT research with SHRC (Starkey Hearing Research Laboratories)
-
---
+{% include gallery.html %}
In 2008, I started work with [CNMAT](http://cnmat.berkeley.edu/) on a research project with the late David Wessel, involving the development of hearing aids. Other PIs were Brent Edwards, Sridhar Kalluri, and Kelly Fitz. Subjects were able to use our test suite to custom calibrate a 16-band compressor in order to come up with weighted interpolations that made more sense to them than a standard preset. The goal was to put the hearing impaired in control rather than administering fittings for customers without feedback. We then compared these choices with the outcome of a neural network in order to do further research and analyses on the data. This work was done in Max/MSP, and the UI had to be used by people with no knowledge of programming.
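The weighted interpolation described above can be sketched as a per-band linear blend between compressor presets. This is a hedged illustration of the concept, not the actual Max/MSP test suite; the band count matches the 16-band compressor mentioned, but the gain values are hypothetical:

```python
# Sketch of weighted interpolation between two compressor gain presets.
# (Illustrative only; preset values are hypothetical.)

def interpolate_presets(preset_a, preset_b, weight):
    """Per-band linear blend of two gain presets; weight in [0, 1]."""
    if len(preset_a) != len(preset_b):
        raise ValueError("presets must have the same number of bands")
    return [(1.0 - weight) * a + weight * b
            for a, b in zip(preset_a, preset_b)]

flat = [0.0] * 16                    # 16-band flat response (dB)
boost = [6.0] * 8 + [0.0] * 8        # hypothetical low-band boost (dB)
blended = interpolate_presets(flat, boost, 0.5)
print(blended[:2])                   # [3.0, 3.0]
```

A subject adjusting a single weight control could in this way explore the space between presets without needing to understand the underlying per-band parameters.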
diff --git a/_posts/academia/research/2008-11-01-CNMAT.md b/_posts/academia/research/2008-11-01-CNMAT.md
index 6561d28..61ad949 100644
--- a/_posts/academia/research/2008-11-01-CNMAT.md
+++ b/_posts/academia/research/2008-11-01-CNMAT.md
@@ -2,10 +2,13 @@
layout: job
title: "CNMAT"
date: 2008-11-01
-categories: academia research
+categories: job
---
# CNMAT
+## Date Range: Nov 2008 - Present
+
+---
My responsibilities at the Center for New Music and Audio Technologies have been to build software and systems for novel instruments, new media applications, and to work with composers towards new tools they can use in their compositions. Recently I've also been maintaining our internal codebase, including legacy objects, build systems, scripts, and documentation. These projects include abstractions, externals, and supporting code that people use for projects both in research and pedagogy. In order to fulfill CNMAT's development requirements, I frequently use programming languages such as Max/MSP/Jitter alongside the IRCAM suite of tools, and languages like JavaScript and Python for file operations and UI.
diff --git a/_posts/academia/research/2008-11-01-hapl.md b/_posts/academia/research/2008-11-01-hapl.md
index 873907c..eeaa895 100644
--- a/_posts/academia/research/2008-11-01-hapl.md
+++ b/_posts/academia/research/2008-11-01-hapl.md
@@ -2,13 +2,16 @@
layout: job
title: "Hafter Auditory Perception Lab"
date: 2008-11-01
-categories: research hearing-science
+categories: job
---
# HAPL
+## Date Range: Nov 2008 - July 2011
+
---
-In 2008 I worked with Professor Ervin Hafter on a research project involving dichotic listening. Subjects were invited to answer questions issued in a virtual cocktail party that I programmed, both through visual stimulus on screen and a 48-channel speaker array in an anechoic chamber. The goal was to track how people pay attention with simultaneous streams of information, and how often they are able to tune in subconsciously. This work was completed with both Matlab and Max/MSP/Jitter as the framework, and the interface for the subjects was an array of switches that allowed them to answer the questions which we could then catalogue.
+From 2008 to 2011, I worked with Professor Ervin Hafter at the `Hafter Auditory Perception Lab` on a research project involving dichotic listening. Subjects were invited to answer questions issued in a virtual cocktail party that I programmed, both through visual stimulus on screen and a 48-channel speaker array in an anechoic chamber. The goal was to track how people pay attention to simultaneous streams of information, and how often they are able to tune in subconsciously. This work was completed with both Matlab and Max/MSP/Jitter as the framework, and the interface for the subjects was an array of switches that allowed them to answer the questions, which we could then catalogue.
+
---
Goals
@@ -23,4 +26,4 @@ Goals

--->
\ No newline at end of file
+-->
diff --git a/_posts/academia/research/2015-04-20-zeropointnine.md b/_posts/academia/research/2015-04-20-zeropointnine.md
index a743a22..cc1110e 100644
--- a/_posts/academia/research/2015-04-20-zeropointnine.md
+++ b/_posts/academia/research/2015-04-20-zeropointnine.md
@@ -5,10 +5,11 @@ date: 2015-04-20
categories: research instrument
---
-# 0.9 instrument for deaf performers
-
+---

+## 0.9 instrument for deaf performers
+
[https://meyersound.com/news/tarek-atoui/](https://meyersound.com/news/tarek-atoui/)
#### Performance site: EMPAC 2016-05-13
diff --git a/_posts/academia/research/2016-04-15-tsar-bell.md b/_posts/academia/research/2016-04-15-tsar-bell.md
index b23f80e..f844711 100644
--- a/_posts/academia/research/2016-04-15-tsar-bell.md
+++ b/_posts/academia/research/2016-04-15-tsar-bell.md
@@ -5,20 +5,21 @@ date: 2016-04-15
categories: research instrument
---
-# Recasting the Tsar Bell
-
-
-
+---
I worked with [Chris Chafe](http://chrischafe.net/about-2-2/), [Greg Niemeyer](https://www.gregniemeyer.com/), [Edmund Campion](http://edmundcampion.com/), and [Perrin Meyer](http://www.lizardinthesun.com/#psm) to help realize two pieces in which a reconstructed virtual model of the Tsar Bell was used as a performable element. Since this is one of the world's largest bells (although it never rang), a number of researchers were interested in hearing it. [John Granzow](http://bcnm.berkeley.edu/news-research/1568/recasting-the-tsar-bell-with-john-granzow) did a lot of work on the modeling of the bell to ensure that it was represented accurately. This involved modeling the bell with a polygonal mesh that was used to simulate frequency behavior in FEA (Finite Element Analysis). FEA allows for the approximation of which frequencies would be excited from the bell, if it existed. John researched specific metals that allowed for the matching of densities in the model, which was eventually used for the resynthesis. This resynthesis was carried out by Chris Chafe in Faust. My role was to build externals with the Faust framework that could be struck by various types of impulses. I then created performance software that incorporated Chafe and Campion's ideas on their respective pieces.
+
+
This work was conceived of by Chris Chafe and Greg Niemeyer. A quote from Greg Niemeyer on the physical characteristics of bells:
-"Although bronze bells don't appear to be elastic, when struck, they deform. The deformation moves throughout the bell, and since the bell is round, the deformation circulates until its energy is absorbed by the environment. The sound comes from the deformation moving the air surrounding the bell. The deformations' constituent frequencies and their amplitudes form waves which define the pitch, volume and timbre of the bell's sound."
+`"Although bronze bells don't appear to be elastic, when struck, they deform. The deformation moves throughout the bell, and since the bell is round, the deformation circulates until its energy is absorbed by the environment. The sound comes from the deformation moving the air surrounding the bell. The deformations' constituent frequencies and their amplitudes form waves which define the pitch, volume and timbre of the bell's sound."`
[BCNM: Recasting the Tsar Bell](http://bcnm.berkeley.edu/news-research/1568/recasting-the-tsar-bell-with-john-granzow)
[KQED Article](https://www.kqed.org/arts/11494671/what-does-a-200-ton-bell-sound-like)
+[HAL Open Science paper on FEA & Bell modeling in Faust](https://hal.archives-ouvertes.fr/hal-02158980/file/bells-with-faust.pdf)
+
#### Performance site: UC Berkeley
[BCNM Event](http://bcnm.berkeley.edu/events/109/special-events/998/recasting-the-tsar-bell)
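The resynthesis approach described above, where FEA-derived mode frequencies drive a synthesis model, can be sketched as a sum of exponentially decaying sine partials (modal synthesis). The frequencies, amplitudes, and decay times below are hypothetical stand-ins, not the actual FEA output or the Faust implementation:

```python
import math

# Hedged sketch of modal resynthesis: a bell-like tone as a sum of
# exponentially decaying sinusoids. Partial data is hypothetical,
# standing in for what an FEA of the bell mesh would provide.

def modal_tone(partials, duration, sr=48000):
    """partials: list of (freq_hz, amplitude, decay_seconds) tuples."""
    n = int(duration * sr)
    out = [0.0] * n
    for freq, amp, decay in partials:
        for i in range(n):
            t = i / sr
            # Each mode rings at its own frequency and decays independently.
            out[i] += amp * math.exp(-t / decay) * math.sin(2 * math.pi * freq * t)
    return out

# Hypothetical low partials for a very large bell.
tone = modal_tone([(81.0, 1.0, 4.0), (162.5, 0.6, 2.5), (271.3, 0.4, 1.5)], 0.01)
```

Striking the model with "various types of impulses," as described above, would amount to exciting these modes with different initial amplitude distributions.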
diff --git a/_posts/art/audio/2014-04-05-modulations.md b/_posts/art/audio/2014-04-05-modulations.md
index d81e8ae..bd4791c 100644
--- a/_posts/art/audio/2014-04-05-modulations.md
+++ b/_posts/art/audio/2014-04-05-modulations.md
@@ -5,10 +5,10 @@ date: 2014-04-05
categories: sound performance
---
-# 2014 Modulations Festival
-

+## 2014 Modulations Festival
+
Carr Wilkerson invited me to be a featured artist in 2014's Modulations festival. It was a great opportunity to catch up with a number of folks I'd gotten to know there over the years, and to spend some time working in their facilities leading up to the show. Dave Kerr did a great job documenting. I had a fun time in the collaboration with [Reza Ali](https://www.syedrezaali.com/) who evoked visual magic during my set.
[Reza's documentation](https://www.syedrezaali.com/ccrmamodulations/)
diff --git a/_posts/art/audio/2016-01-30-finite00.md b/_posts/art/audio/2016-01-30-finite00.md
index 7aad659..2599d3f 100644
--- a/_posts/art/audio/2016-01-30-finite00.md
+++ b/_posts/art/audio/2016-01-30-finite00.md
@@ -1,12 +1,10 @@
---
layout: project
-title: "Finite 00"
+title: "Finite Series: 00"
date: 2016-01-30
categories: sound performance
---
-# Finite Series #00
-
A quarterly series: Miller, Equilet, Kanaga, Merkey
Cullen Miller invited me to perform music. I utilized one of my early modular setups for a solo performance.
diff --git a/_posts/art/audio/2016-04-30-finite01.md b/_posts/art/audio/2016-04-30-finite01.md
index 1c30f62..bca248e 100644
--- a/_posts/art/audio/2016-04-30-finite01.md
+++ b/_posts/art/audio/2016-04-30-finite01.md
@@ -1,12 +1,10 @@
---
layout: project
-title: "Finite 01"
+title: "Finite Series: 01"
date: 2016-04-30
categories: sound performance
---
-# Finite Series #01
-
A quarterly series: Miller, Dunne, Equilet, Rene Hell, Easy Simple
I was invited a second time to perform music at the Finite quarterly. This time was more involved, and took a bit more time to set up. I played after hours in the dark. After Rene Hell and EasySimple (simple) performed, [Gabriel Dunne](http://gabrieldunne.com/) and [Cullen Miller](http://pointlinesurface.com) performed with a live A/V set that they designed with bespoke software.
diff --git a/_posts/art/audio/2016-04-30-finite02.md b/_posts/art/audio/2016-04-30-finite02.md
index 4669e08..a149dab 100644
--- a/_posts/art/audio/2016-04-30-finite02.md
+++ b/_posts/art/audio/2016-04-30-finite02.md
@@ -1,12 +1,10 @@
---
layout: project
-title: "Finite 02"
+title: "Finite Series: 02"
date: 2016-07-30
categories: sound performance
---
-# Finite Series #02
-
A quarterly series: Miller, Equilet, Last, Jelinek, Scy1e
I was invited a third time to perform music at the Finite quarterly, this time at Gray Area. This particular performance was more ambient and textural, and I opened for the night. It was a pleasure to perform alongside [Jan Jelinek](https://faitiche.de/t/artists/janjelinek), a long-time interest of mine. [Scy1e](https://scy1e.bandcamp.com/) played an excellent spatialized set involving electronics from Peter Blasser's lab, and [Cullen Miller](http://pointlinesurface.com/) collaborated with [David Last](https://www.facebook.com/pg/lastfaithstudio/) in a dance set.
diff --git a/_posts/art/audio/2016-09-01-bergen-assembly.md b/_posts/art/audio/2016-09-01-bergen-assembly.md
index e6eee74..8573d21 100644
--- a/_posts/art/audio/2016-09-01-bergen-assembly.md
+++ b/_posts/art/audio/2016-09-01-bergen-assembly.md
@@ -5,10 +5,12 @@ date: 2016-09-01
categories: sound performance
---
-# Infinite Ear at the Bergen Assembly
+## Infinite Ear at the Bergen Assembly
Tarek Atoui partnered with [Council](http://www.council.art) to create a masterful installation in the Sentral Badet community space of Bergen, Norway. My role was to set up the 0.9 instrument, calibrate it to the room, develop the software further with Tarek leading up to rehearsals, and train people on it through subsequent days of the exhibition prep. We worked with professional musicians and newcomers alike, many of them deaf persons. It was an eye-opening experience to see the ways in which something of my own creation could be used to teach people how to create abstract music for the first time in their lives. There were onsite interpreters to translate between sign and spoken language, and many interesting aspects of the space and partitions of the installation to explore and engage in as the time went by.
+
+
Some highlights:
- Tarek and I worked with the [Bit20 Ensemble](http://bit20.no/home-english/) to aid them in particular pieces they would air on 9/2/2016 (the opening).
@@ -16,12 +18,9 @@ Some highlights:
- [Grégory Castéra and Sandra Terdjman](http://www.council.art/) set up a museum of sonic artifacts around the interior of the space, which taught people about the ways in which sound has been interpreted and used throughout history
- There was a sound massage room where you could book an appointment for 1 hour increments
- Johannes Goebel installed his [SubBassProtoTon](http://empac.rpi.edu/events/2017/fall/subbassprototon), an acoustic bass generator: a walk-in room construction whose occupant controls a low tone with a sliding window through which air passes, creating vibration
--
[Council's permalink](http://www.council.art/residency/760/within-infinite-ear)
-
-
---
Themes
@@ -33,8 +32,6 @@ Themes
---
-![mod]()
-
diff --git a/_posts/art/audio/2018-05-18-bischofflubowmiller.md b/_posts/art/audio/2018-05-18-bischofflubowmiller.md
index 11cdce9..79ecc85 100644
--- a/_posts/art/audio/2018-05-18-bischofflubowmiller.md
+++ b/_posts/art/audio/2018-05-18-bischofflubowmiller.md
@@ -1,16 +1,16 @@
---
layout: project
-title: "CNMAT: live electronics"
+title: "Live w/ John Bischoff"
date: 2018-05-18
categories: sound performance
---
-# CNMAT: Live Electronics w/ John Bischoff
-
CNMAT [presented](http://cnmat.berkeley.edu/events/john-bischoff-w-jeffrey-lubow-cullen-miller) an evening with John Bischoff, Cullen Miller, and myself.

+## Performances with live electronics at CNMAT
+
John Bischoff (b. 1949) has been active in the experimental music scene in the San Francisco Bay Area for over 40 years as a composer, performer, and teacher. He is known for his solo constructions in real-time synthesis and the pioneering development of computer network music. He has performed across the US and in Europe, including such venues as the Festival d'Automne in Paris, Akademie der Künste in Berlin, and Fylkingen in Stockholm. He was a founding member of The League of Automatic Music Composers (1978), considered to be the world's first computer network band. He is also a founding member of The Hub (1987), a network band that continues to expand on the network music form in new ways. Recordings of Bischoff's work are available on Artifact, 23Five, Tzadik, Lovely, and New World Records. A solo CD on New World titled "Audio Combine" was named one of the BEST OF 2012 by WIRE magazine. He is full-time faculty in the Music Department at Mills College in Oakland, California.
[Bischoff website](http://www.johnbischoff.com/)
diff --git a/_posts/industry/2009-01-01-shrc.md b/_posts/industry/2009-01-01-shrc.md
index 9da6a44..4388f7a 100644
--- a/_posts/industry/2009-01-01-shrc.md
+++ b/_posts/industry/2009-01-01-shrc.md
@@ -2,15 +2,21 @@
layout: job
title: "Starkey Hearing Research Center"
date: 2009-01-01
-categories: research hearing-science
+categories: job
---
# Starkey Hearing Research Laboratories (SHRC)
+## Date Range: Jan 2009 - Jul 2011
---
-Senior developer on a research programming team for the development of hearing aids.
-Tools written in: Matlab, Max/MSP/Jitter, Python
+From Jan 2009 to Jul 2011, I was a Senior Developer on a research programming team for the development of hearing aids. I reported primarily to Kelly Fitz and Tao Zhang, and worked in tandem with Sridhar Kalluri and Martin McKinney.
+
+Much of the work involved rapid prototyping for research subject testing, including the development of bespoke software to analyze and extract meaning from a collected pool of subjects' data. So-called "Golden Ear" specialists were invited in to assist our research. Their experiences and preferences helped form a model we used to train a custom neural network, which related subjects' music-perception preferences to expert ground truths.
+
+
+
+Tools were written in Matlab, Max/MSP/Jitter, and Python.
---
@@ -19,8 +25,3 @@ Goals
- Develop and maintain software that subjects used for hearing-related experiments
- Correlate data in findings and do further analysis
- Report to SHRC team regarding updates and research goals
-
----
-
-
-
diff --git a/_posts/industry/2014-05-04-djtt.md b/_posts/industry/2014-05-04-djtt.md
index 96ae074..f67e48b 100644
--- a/_posts/industry/2014-05-04-djtt.md
+++ b/_posts/industry/2014-05-04-djtt.md
@@ -1,28 +1,37 @@
---
-layout: job
-title: "DJTT: ENTER / Orbit instrument"
+layout: project
+title: "ENTER / Orbit instrument"
date: 2014-05-04
-categories: industry product-design
+categories: art-tech
+images:
+ - path: /assets/orbit01.png
+ title: orbit 01
+ - path: /assets/orbit03.jpg
+ title: orbit 03
---
-# DJ Techtools: Orbit Instrument for Richie Hawtin's ENTER exhibition
-
---
-Plastikman's "Enter." club featured an entire level to house "Orbit". Orbit is an interactive music table that allows up to 6 performers (12 hands) to perform collaboratively.
-Worked with team at DJ Techtools to build an interactive music experience for club in Ibiza, Spain. In addition to headphones, [Subpacs](https://subpac.com/) were used in order to allow performers to differentiate the low frequencies coming from their input from adjacent rooms in which DJing was happening.
-[Enter](http://enterexperience.com/)
+Richie Hawtin's (Plastikman) "Enter." club & exhibition space featured an entire level to house "Orbit". Orbit is an interactive music table that allows up to 6 performers (12 hands) to perform collaboratively.
+I worked with the team at [DJ Techtools](https://djtechtools.com/) to build an interactive music experience for the club in Ibiza, Spain. In addition to headphones, [Subpacs](https://subpac.com/) were used to allow performers to differentiate the low frequencies coming from their input from adjacent rooms in which DJing was happening.
+
+[Main ENTER site portal](http://enterexperience.com/)
+
+[Subpac's information relating to project](https://subpac.com/enter-ibiza/)
-[Subpac](https://subpac.com/enter-ibiza/)
+
+
+
---
Roles
---
+---
- Senior software developer
- Implementation of algorithms
-- Gesture recognition
-- UI elements to tune instrument
-
-
+- Implementation of gesture recognition
+- UI elements to tune/calibrate instrument
+---
+
+{% include gallery.html %}
\ No newline at end of file
diff --git a/_posts/industry/2014-08-01-c74.md b/_posts/industry/2014-08-01-c74.md
index e9c0dd4..2de7f58 100644
--- a/_posts/industry/2014-08-01-c74.md
+++ b/_posts/industry/2014-08-01-c74.md
@@ -2,25 +2,35 @@
layout: job
title: "Cycling '74"
date: 2014-08-01
-categories: industry software
+categories: job
+images:
+ - path: /assets/c74dark.png
+ title: c74dark
+ - path: /assets/c74grey.png
+ title: c74grey
+ - path: /assets/c74light.png
+ title: c74light
---
# Cycling '74: Max 7
+## Date Range: Aug 2014 - Nov 2014
+
+---
----
Performed rigorous internal testing with David Zicarelli and other collaborators including Joshua Kit Clayton, Rob Sussman, Jeremy Bernstein, and Emmanuel Jourdan.
[Cycling HQ](http://cycling74.com/)
-David Zicarelli's [first look article] on [Max 7](https://cycling74.com/articles/a-first-peek-at-max-7)
+
+
+David Zicarelli's `first look article` on [Max 7](https://cycling74.com/articles/a-first-peek-at-max-7)
---
+{% include gallery.html %}
+
Experience
--
- Rigorous internal testing
- Internal reporting and bug tracking
- Set up various test harnesses for exploiting issues
-
-
-