From the resource:
Want to build a website right in RStudio? blogdown is an R package that allows you to create websites from R Markdown files using Hugo, an open-source static site generator written in Go and known for being incredibly fast.
I started the process by reading through the first few chapters of the blogdown book. It has a ton of great information, and Yihui, Amber, and Alison make the information very accessible. As I dug into the installation chapter, it was also helpful for me to follow Alison Presmanes Hill’s post.
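For readers who want to try it, the basic workflow looks roughly like this. This is a minimal sketch, not the book's exact instructions; the theme name is just an illustrative choice.

```r
# Minimal blogdown workflow (a sketch; run inside a fresh RStudio project).
# install.packages("blogdown")            # install from CRAN first
library(blogdown)

install_hugo()                            # install the Hugo static site generator
new_site(theme = "yihui/hugo-lithium")    # scaffold a site with a starter theme
new_post("my-first-post.Rmd")             # create a new post as an R Markdown file
serve_site()                              # live-preview locally; rebuilds on save
```

From there, editing and saving a post triggers a rebuild, and the preview updates in the RStudio viewer.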
Race, Gender, and Toxicity Online Plenary Roundtable
When: 9:30 to 11 a.m. Thursday, April 25
Sponsored by: Social Science Research Council and the Center for Media Engagement in the Moody College of Communication at the University of Texas at Austin.
Plenary Roundtable: Professor Zizi A. Papacharissi, University of Illinois-Chicago; Professor Lisa Nakamura, University of Michigan; Assistant Professor Catherine Knight Steele, University of Maryland.
Moderator: Gina Masullo Chen, Assistant Director of the Center for Media Engagement and Assistant Professor of Journalism, The University of Texas at Austin.
From the ad:
The USF Libraries’ Research Platform Teams (RPT) partner with graduate students and faculty in departments and across disciplinary clusters to promote innovative, collaborative services and drive research discovery. The teams establish deep relationships with faculty and graduate students, forging active partnerships through research, publication, grant writing, teaching, and informed collection management. Librarian team leads will participate in a comprehensive approach to the research process.
The strategy combines librarians with subject expertise at or above the Master’s level with professionals possessing functional expertise in such domains as data management, GIS, statistics, AR/VR, design, 3D visualization, and more. RPTs assume responsibility for the collections that are associated with their targeted discipline/disciplinary cluster and are encouraged to teach credit-bearing courses in their assigned area.
From the resource:
In Fall 2018, Professor Molly Ball’s History 252: Immigration in the Americas students developed original research based on archival and primary sources to explore how Rochester’s own immigrant history can not only enrich our understanding of the city’s history, but also further our understanding of transnational immigrant experiences throughout the Americas. One of the ways the students’ research has been showcased was in an interactive mapping exhibit…
This project and associated exhibit required us to make some interesting technological choices: how data would be stored and updated, which mapping applications to use and how the data and applications would be configured for the exhibit.
Archives are places. They are institutions. But to archive is also an action. Web archiving is a process that produces web archives, and personal digital archiving is a set of practices for working to ensure long-term access to personal digital content.
When and how did archive become a verb? Webster’s dates the noun usage to 1603 and the verb usage to 1831, but I’m curious how obscure the verb usage was over time.
My sense/hunch has been that the verb form of archive is tied up in the history of computing. A tape archive is a higher-latency storage mechanism, and there is a long-standing use of “archiving” as a concept that involves writing to tape; the .tar file format takes its name from “tape archive.” So, when did archive become a verb, and to what extent is archiving related to the development of computing?
This kind of question is exactly the sort of thing that the Google Ngram Viewer is useful for. Over time I’ve generated a few different graphs of trends around the verb usage of archive in Google Books and posted them to Twitter. It seemed worth taking a few minutes to explore that data a bit more. What follows is really just some initial notes on some searches. I’m curious to get other interpretations of what we learn from these charts and examples of usage.
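One way to reproduce this kind of chart from R is with the third-party ngramr package, which queries the Google Books Ngram Viewer; the package choice is my own suggestion, not something the post uses. The Ngram Viewer supports part-of-speech tags, which is exactly what the noun-versus-verb question needs.

```r
# Sketch: compare noun vs. verb usage of "archive" in Google Books
# using the ngramr package (an assumption on my part; install from CRAN).
# install.packages("ngramr")
library(ngramr)

# _NOUN and _VERB are the Ngram Viewer's part-of-speech tags.
archive_usage <- ngram(c("archive_NOUN", "archive_VERB"),
                       year_start = 1800, year_end = 2019)

ggram(archive_usage)  # plot both trends with ggplot2
```

The resulting plot makes it easy to eyeball when the verb usage starts climbing relative to the noun.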
For HASTAC, this story has particular relevance since we were founded with the conviction that the technologies emerging from Silicon Valley had to have ethical and social dimensions, including ones based on access and equity. HASTAC has deep roots in a scholarly tradition of interdisciplinary collaboration in research and teaching across domains as well as a commitment to technological innovation–and some of those roots go back to Stanford.
The organization HASTAC was founded in 2002, cofounded by me (Duke University) and David Theo Goldberg (Director of the University of California Humanities Research Institute), along with a dozen other scholars and technology practitioners across a wide variety of fields. One of the founders (and an early technology visionary) was Kevin Franklin (then Associate Director of UCHRI). UCHRI hosted HASTAC’s first face-to-face meeting in 2002 (and our first conference in 2003). We all began to imagine what an online version of our interactions, contributions, and collaborations might look like, including the UCHRI leaders plus other members of the original design team: Ruzena Bajcsy (the eminent engineer at UC Berkeley), Kathleen Woodward (Humanities Center, University of Washington), Tara McPherson (Film and Media, USC), Anne Balsamo (Interactive Media, USC), and others. The HASTAC online network–the “world’s first and oldest academic social network” according to an NSF report–was imagined collaboratively and collectively at numerous face-to-face gatherings, including one hosted by the cyberinfrastructure division of NSF.
At Stanford, the group involved in translating f2f cross-institutional collaboration into an interactive community-led online tool included: former Stanford Professors Jeffrey Schnapp and Tim Lenoir, librarian Henry Lowood (currently Curator for History of Science & Technology Collections and Film & Media Collections in the Stanford University Libraries) and doctoral students KC Alt, Jesse Thompson (HASTAC.org’s first Webmaster), and Zachary Pogue (HASTAC’s second Webmaster, first at Stanford, then Duke).
We had to imagine such a tool together since it didn’t yet exist. HASTAC.org began as a display website linked to this brand new thing called a “wiki,” a website on which any registered user could contribute or modify content directly from a web browser. Ward Cunningham designed and launched the first wiki on the Internet in 1995. The most famous wiki, the online encyclopedia Wikipedia, launched on the Internet in 2001.
In 2002, the HASTAC team began to build its site and its wiki. Helping us to design the site was none other than Jimmy Wales, cocreator of Wikipedia. I believe it was in 2002 or 2003 that we first met him. I remember we first visited his office around that time—cement block walls, a few guys at desktops. We also engaged in a few early experiments with what were then called “weblogs” or “blogs” (considerably before “Web 2.0” made “interactivity” readily available). My point: we were imagining technologies for scholarly collaboration and interaction back then that are now everywhere . . . and humanists and social scientists were there and key.
About the funding:
The Program will enable international scholars to study the digital collections at Leiden University Libraries and to collaborate with the innovative Centre for Digital Scholarship (CDS) at Leiden University. The program will financially support the research fellows to stay in Leiden for a period of two months, where fellows will be invited to share their research outcomes through public lectures and publications. These fellowships offer a new digital scholarship perspective building on the longstanding Scaliger Institute fellowship program that focused on the study of the special collections at Leiden University.
From the ad:
Reporting to the Director of Academic Data Services, the Digital Humanities Natural Language Processing (NLP) Specialist is responsible for working closely with a diverse client base composed of faculty, students, and staff to help them utilize computationally intensive methods and technology applied to a vast array of contemporary and historical textual source material to achieve their scholarly teaching and research goals. Support for computationally intensive textual scholarship includes cutting-edge methods and tools for text analysis, text mining, natural language processing (NLP), machine learning (neural nets, deep learning), social network analysis, web scraping, etc. Key responsibilities of this position include working with members of TTS Research Technology to provide consulting services to faculty, students, and staff from a wide range of departments and academic disciplines; to assist them in the design and development of complex projects from initial concept to delivery; and to provide support, education, and outreach…
From the announcement:
The Papers of the War Department team is happy to announce that we’re back online!
The last time you heard from us, we were putting transcriptions on hold to start a total redesign of the Papers of the War Department website with the support of a grant from the American Council of Learned Societies. This included migrating nearly 200 gigabytes of Papers of the War Department items, metadata, and image files into Omeka S, which now powers the website with a fully redesigned look and feel. We’ve also released an updated beta version of the Scripto plugin, which facilitates transcription with updated functionality for an easier process.
When I sat down to write this post I had no ideas. That’s probably inevitable, given the year of blogging challenge that we’re undertaking in the Scholars’ Lab. The whole point is to write regularly: there is value in a steady stream of thoughts rather than waiting for the perfect blog post, and regular writing makes the whole thing easier. Still, all those good intentions didn’t help me as I struggled to put text to a blank page. As I often do in those situations, I got out a deck of cards and started playing.
I’ve been obsessed with Oblique Strategies for years now. If you’re not familiar, Oblique Strategies is a deck of cards published by Brian Eno and Peter Schmidt that aims to offer short, pithy suggestions for getting around creative dilemmas. The idea behind them is that the serendipity of drawing a mysterious phrase from the deck will help disrupt any blocks you might have moving forward. I’ve got a stack of them that I keep on my desk, and it’s a comfort to know that I’ve always got a wrench to throw in the gears at any given time. This morning as I flipped through the deck for inspiration these were the cards that came up first:
When is it for?
Use an old idea
Turn it upside down
Once the search is in progress, something will be found
Humanize something free of error
A lot there! Surely, somewhere in there, I could find material for a successful blog post on digital pedagogy, the subject I’ve been trying to focus on with these regular posts. I did, and I could. But the activity interested me more: how could these cards – the idea of them more so than any one phrase on them – inform teaching within the field of DH more generally?
Of course, I’m not the first one to think about how Oblique Strategies might apply to DH. Mark Sample’s keynote for the first annual Institute for Liberal Arts Digital Scholarship took up this topic. In “Your Mistake was a Vital Connection: Oblique Strategies for the Digital Humanities,” Sample does a fantastic job articulating the potential for the deck to inspire digital humanities research and pedagogy. Sample advocates not just using the deck as a means to an end – he suggests making serendipity the process and outcome itself. The deck can be a quirky way to step over difficulty and get back to the serious business of doing work, but it can also offer a reconsideration of what the work could look like in the first place.
Last month, I led a workshop for the GC Digital Initiatives on “Getting Started with TEI.” For those who don’t know, TEI (short for Text Encoding Initiative) is a method for encoding, or “tagging,” texts in such a way that both humans and computers can make sense of them. It is a set of guidelines used for electronic editing and working with textual data in the humanities, social sciences and linguistics, which is based on XML (the eXtensible Markup Language). With TEI, editors and scholars can “tag” a text for various features such as structure, typography, or references. My workshop specifically focused on how TEI facilitates the digital transcription of hand-written manuscripts. We practiced encoding a couple of pages from Oscar Wilde’s manuscript of “The Picture of Dorian Gray,” paying particular attention to how Wilde edited out the homosexual elements and innuendos as he revised his draft.
In the image below, you can see how the workshop participants used TEI to mark up the revisions that Wilde made on the passage. Here, the participants went through the manuscript line by line to indicate what is written and where things are struck out, added, or cannot be read. Among the available TEI elements, we focused on <del>, <add>, and <gap>, which indicate deletions, additions, and undecipherable script, as well as the rend attribute, which describes how or where a piece of text is rendered, such as with a strikethrough or above the current line. For example, in the excerpt from our encoding below, you can see the rend attributes in orange.
At the end of the workshop, we were able to see our progress by transforming the TEI file into HTML, which we could then view in the browser. The browser rendition shows how the typographic elements appear once applied to the source text. As a result, it’s useful for presenting the manuscript details in an easy-to-read format. For example, an excerpt tagged with “strikethrough” would present the text with a line running across it. You can see this presentation in the image at the top of this article, which displays our transformed TEI side by side with the original manuscript page. In textual editing terms, this kind of transcription is known as a “diplomatic transcription” of a manuscript.
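As a rough illustration of what this kind of markup looks like, here is a small TEI fragment using the elements discussed above. The sentence is invented for illustration, not the workshop's actual encoding of Wilde's manuscript.

```xml
<!-- Hypothetical example, not the workshop's actual encoding -->
<p>
  <lb/>He watched the portrait with a look of
  <del rend="strikethrough">passionate</del>
  <add place="above">curious</add> wonder,
  <gap reason="illegible" unit="word" extent="1"/> in the fading light.
</p>
```

Here <lb/> marks a line beginning in the source, the rend and place attributes record how the deletion and addition appear on the page, and <gap> records a word the transcriber could not read.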
From the ad:
The postdoctoral researcher will contribute to a project funded by the National Endowment for the Humanities to develop an online reference tool for the medieval Middle East. Specific duties include advanced data collection using the indices of published primary sources in at least two different languages; contributing to editorial review and design decisions; historical research related to the project; testing the tool before publication; presentations at academic conferences; and a public lecture at Oklahoma State University discussing the project and its outcomes. Some travel will be required to fulfill these duties. Inquiries about the project or the position may be directed to Thomas A. Carlson at firstname.lastname@example.org.
Source: Read the full ad here.
From the report:
The 2019 AHA annual meeting featured a roundtable discussion on an emerging aspect of doctoral education in history: the digital dissertation. Jeri Wieringa (George Mason Univ.) and I co-organized this panel, which sought to bring together recent graduates and current doctoral students and their advisers for a candid discussion on digital dissertations. The panel consisted of the following participants, student and adviser, grouped by dissertation project: me and Suzanne E. Smith (George Mason Univ.); Wieringa and Sharon M. Leon (Michigan State Univ.); and Zoe LeBlanc (Vanderbilt Univ.) and Madeleine Casad (Vanderbilt Univ.). Thanks to the overlap of the AHA and MLA annual meetings in Chicago this year, we were able to ask Lisa Rhody (Graduate Center, CUNY) to be chair and moderator.
From the report:
The 7th International Conference Re:Trace for the Histories of Media Art, Science and Technology featured a DARIAH Connectivity Day on November 25, 2017 at the Academy of Sciences in Vienna. This event, which was funded by the DARIAH Theme 2017, featured discussions on media art as part of our Digital Cultural Heritage, the challenges of its preservation and the role of (digital) infrastructures.
The diverse challenges of documenting and preserving media art, online and offline, constitute one of the most important discourses across disciplines, from art history and image science to information and computer studies. Because its dynamic, ephemeral, complex, and/or processual structure is founded in digital technologies as both its medium and subject, media art cannot be preserved like object-oriented art; any method of documentation and preservation must acknowledge it as part of a dynamic memory culture.
A simple assignment for students to explore iteration & revision.
When the blackbird flew out of sight,
It marked the edge
Of one of many circles.
— Wallace Stevens
How might we encourage students to embrace revision more fully? I ask my students to draw inspiration from Wallace Stevens’s poem “Thirteen Ways of Looking at a Blackbird” to pose “Thirteen Ways of Looking at a Thesis.”
Digital technology lends itself well to this experimental (but actually quite simple) assignment. Students begin by posing a research question and a one-to-two sentence hypothesis in response to it. Then, rather than edit that first attempt, they simply write the research question and hypothesis again, in a new way, below the first one. They proceed this way up to thirteen times. The goal is to keep the old while exploring iteration in service of discovering and crystallizing what one wants to argue.
What happens? The question and thesis begin to change as one rewrites them, but you can always return to earlier versions without losing their formulation at that moment in your thinking. To be sure, one can do this on paper with a pen. There is something productively counterintuitive, however, in insisting upon doing it in the digital domain. Because of the easy manipulability of text, we increasingly write digitally in ways that combine writing with revision. You edit as you write. That has its benefits, of course. But shifting between this mode and one in which what gets written gets preserved allows for a kind of tracking (to use the Microsoft Word term) of one’s thinking as a piece of analysis develops. It cuts against the technology while using it to preserve one’s thinking over time. (One can add to this assignment with subsequent work using flow chart “mind maps,” timelines, storymaps, and annotations to continue to map out lines of reasoning that build on a set of articulated and re-articulated questions.)
This post is an extended version of my paper for the April 2019 workshop held by the AHRC Research Network on Petitions and Petitioning from the Medieval Period to the Present, on the theme Petitioning in Context: when and why do petitions matter?
The network is explicitly interdisciplinary, international and comparative: it brings together social and political historians, political and social scientists, literary and media scholars, as well as officials administering legislative e-petitions systems and representatives of NGOs using e-petitions in their campaigns. In terms of chronological and geographical scope, the network covers petitions and petitioning from the medieval period to the present day in polities across Europe and North America.

Why do “prosaic” petitions matter?
“Prosaic” petitions is a term I recently discovered in an article by Alan MacDonald (2018) which I was particularly taken with as a description of the London Lives petitions, and the many thousands of 17th-18th-century petitions like them.
I think they mattered, as MacDonald argues very succinctly, in two essential ways. Firstly, they mattered to petitioners.
Most of the time, petitioning concerned ordinary things, but these things were important to those making the requests…
And they provide
an important foundation from which interactions between governors and the governed can be understood… Prosaic petitioning can enhance our understanding of the shifting dynamics of power in a small, early modern state (MacDonald 2018, p.294)
Both MacDonald and Laura Stewart (2018) also use the term “everyday” petitions; these petitions definitely aren’t about supporting political causes or agitating for political change.
Nonetheless, they do embody conflict and contested authority. What I mean by that is not just that prosaic petitions are about grievances and disputes but also that, very often, those grievances concern the actions (or inaction) of social superiors or authorities, and are presented to another institutional authority for adjudication, with potentially face-threatening consequences for the subject of the complaint. However conservative the request and humble the language, that is potentially disruptive. (In fact, surely the language is “humble” precisely in order to soften the threat to the hierarchical order of things.)
How, and for whom, petitioning mattered changed significantly in London over the course of the 18th century, and those changes were strongly gendered. This is my initial attempt to explore them.
From the CFP:
Now in its ninth year, the Digital Humanities Forum brings together faculty, graduate students, and undergraduate students from the University of Kansas and beyond to celebrate and explore digital scholarship as a diverse and growing field of humanist inquiry.
This year, the theme of the Forum is: Bodies, Justice, Futures. With this theme, the Digital Humanities Forum hopes to inspire presenters to think about the ways in which we envision and build towards just futures for individual and collective bodies from around the globe. By evoking the human body, we ask presenters to foreground humanistic inquiries of digital culture and technology, to trace continuities between historical realities and present socio-political conditions, and/or to take up issues related to marginalized and invisible lived experiences. Suggested topics related to our theme’s keywords are listed below.
From the resource:
Jeopardy is a popular request from students who want an in-class review activity, but Jeopardy has some critical drawbacks. First and foremost, it asserts that there are right and wrong answers which can be condensed into minimal words. Jeopardy, by its very foundation, discourages nuance and critical thinking. It also prioritizes knowledge which is traditionally pale, male, and stale. Second, from a labour standpoint, the game demands a tremendous amount of work from the professor or TA who creates it, while those who answer the questions are not compelled to demonstrate significant knowledge. There must be a better way.
The original Cards Against Humanity game is simple. Each round, one player asks a question from a black card, and everyone else answers with their funniest white card. Cards Against Environmental History (CAEH) follows this same format.
About the funding:
As part of a collaboration between Oxford and the Sorbonne, we are delighted to announce the new call for applications for a three-year fully funded fellowship open to students wishing to pursue doctoral studies in the history of science, in mathematical sciences, in digital humanities, or in computer science. Details of the fellowship are set out below, both in English and in French.
Recent progress in digital humanities has transformed research in the history of science: large quantities of data, the collation of which would formerly have required time-consuming visits to libraries and archives, have been made available; manuscript and book collections are accessible online; and investigations across a range of related resources become ever easier. In consequence, historical investigations can be contextualized better, studies of networks taken to a new level, and analysis conducted across increasingly large quantities of data and metadata.
From the resource:
APIs, or application programming interfaces, are a way for programs to access data in a structured, plain-text format from many programming languages. Many websites, organizations, and services offer APIs for accessing their data, including Twitter, Wikipedia, Reddit, and OpenSecrets.
This tutorial will walk you through accessing APIs from DataUSA and OpenSecrets in RStudio with the help of R’s httr and jsonlite packages.
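The general pattern with httr and jsonlite looks something like the sketch below. The DataUSA endpoint and field names here are assumptions on my part; the tutorial itself has the specifics, and OpenSecrets additionally requires an API key.

```r
# Sketch of the request-then-parse pattern the tutorial describes,
# against the public DataUSA API (endpoint and fields are assumptions).
library(httr)
library(jsonlite)

resp <- GET("https://datausa.io/api/data",
            query = list(drilldowns = "Nation", measures = "Population"))
stop_for_status(resp)   # fail loudly if the HTTP request did not succeed

# The response body is JSON text; fromJSON() turns it into R data structures.
parsed <- fromJSON(content(resp, as = "text", encoding = "UTF-8"))
head(parsed$data)       # inspect the first few rows of the returned table
```

The same two-step pattern — build a GET request with query parameters, then parse the JSON body into a data frame — carries over to most JSON-returning APIs.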