Affect as a Service: Artforum writing and A.I.

“I scraped all of Artforum almost a year ago now, initially as a way to do some kind of work on the nature of art writing, and a while later Jules got in touch with me because he had used the somewhat recently released GPT-2 model for text generation on a Pitchfork dataset we had on the site. GPT-2 is the model created by OpenAI that was the 'too dangerous to release' natural language processing model they then decided to release anyway, lol. It became notable early on for being able to produce uncannily human-like text, unlike basically any model that came before it, so we decided to apply it to the whole Artforum dataset, which stretches from the first Artforum piece back in, what, 1960-something to I think January 2019. So that's what the generator on the page is, and what I wanted to do was basically show the generator to people both at Artforum and outside it and get some sense of their reactions to it, to the nature of the texts it's producing, etc.”
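For anyone curious what "applying the model" to a corpus actually looks like, here is a minimal sketch of GPT-2 text generation using the Hugging Face transformers library. The project described above fine-tuned GPT-2 on the scraped Artforum corpus; this sketch only shows the generation step with the stock pretrained model, and the prompt is an invented example, not a line from the Artforum dataset.

```python
# Minimal sketch: sampling art-writing-style text from GPT-2 with the
# Hugging Face transformers library. The actual generator was fine-tuned
# on the Artforum corpus; here we just use the off-the-shelf "gpt2" weights.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Hypothetical prompt, not taken from the Artforum dataset.
prompt = "The exhibition stages a confrontation between"
input_ids = tokenizer.encode(prompt, return_tensors="pt")

# Sample a continuation; temperature and top-k trade off how uncanny
# versus repetitive the generated text reads.
output = model.generate(
    input_ids,
    max_length=120,
    do_sample=True,
    top_k=40,
    temperature=0.9,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```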