
Justin Matejka, Tovi Grossman, George Fitzmaurice (2014)

Video Lens: Rapid Playback and Exploration of Large Video Collections and Associated Metadata
Proceedings of UIST 2014: ACM Symposium on User Interface Software & Technology
10 pages

Baseball Video Lens

Conceptually, the Video Lens framework is general enough to accommodate any large set of videos with metadata. In practice, however, an implementation for any specific usage domain may impose unique design challenges.

To test out the framework for a specific domain, we created the Baseball Video Lens system using a collection of 29 Major League Baseball games with approximately 8,000 pitches in total. Each pitch has a collection of associated metadata from the PITCHf/x system.

The purpose of the system is to allow anyone to explore and search through a large collection of baseball game videos and find relevant or interesting clips based on dynamic querying of the associated metadata.

Interface Elements

Fig 1. Baseball Video Lens user interface components.

The Baseball Video Lens interface consists of four main components: three data viewing/filtering components, and the video playback window (Fig 1). The Single Attribute Controllers individually map more than 40 attributes for each pitch, ranging from basic information such as pitch type, pitch speed, batter name, and pitcher team, to more advanced metrics such as break length, spin rate, and pitch release location.

Single Attribute Controllers

Fig 2. Dynamic highlighting of events in multiple interface components.

Throughout the interface, a consistent mapping of "1 dot = 1 pitch" is followed; that is, every dot in the interface represents a single pitch in the dataset. The UI elements in each data viewing/filtering component are tightly linked to each other and support "real-time brushing," so that hovering the cursor over one component highlights the corresponding dots/events in each of the other components (Fig 2). This real-time "linking and brushing" interaction can expose relationships between attributes that you weren't explicitly looking for, and encourage further exploration of the data.
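The linked-brushing behavior can be thought of as a simple observer pattern: every view renders the same set of pitch records, and hovering in any one view broadcasts the hovered pitch IDs so that all views highlight the same dots. The sketch below is purely illustrative (the `View` and `Brush` names are ours, not from the paper):

```python
# Hypothetical sketch of the "1 dot = 1 pitch" linked-brushing idea.
# Every view renders the same pitch records; hovering in one view
# broadcasts the hovered pitch IDs so all linked views highlight them.

class View:
    def __init__(self, name):
        self.name = name
        self.highlighted = set()

    def on_brush(self, pitch_ids):
        # Highlight the dots for the brushed pitches in this view.
        self.highlighted = set(pitch_ids)

class Brush:
    def __init__(self, views):
        self.views = views

    def hover(self, pitch_ids):
        # Hovering in any one component updates every linked view.
        for view in self.views:
            view.on_brush(pitch_ids)

views = [View("speed"), View("pitch type"), View("grid")]
brush = Brush(views)
brush.hover({101, 102})  # cursor passes over two dots
assert all(v.highlighted == {101, 102} for v in views)
```

The key design point is that no view owns the selection: the hovered pitch IDs are the shared state, and each component simply re-renders its own dots against them.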

Fig 3. Selecting the slow two-seam fastballs.

On startup, the Video Player component cycles through all pitches in the dataset. There are two ways to filter the collection of pitches. The first is to select a value, or range of values, from the Single Attribute Controllers. In the example in Figure 3, we first click on 'FT' to select the two-seam fastballs, and the other controllers immediately update to show the selection. Next, we drag over a range in the "speed" controller to select the pitches between 83 and 89 mph.
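Conceptually, each controller contributes one predicate and the result set is their conjunction. A minimal sketch of that dynamic-query filtering, assuming each pitch is a record with "type" and "speed" fields (the field names are our guesses, not the actual PITCHf/x schema):

```python
# Illustrative AND-filter over pitch metadata records.
# Field names ("type", "speed") are assumptions, not the real schema.

pitches = [
    {"id": 1, "type": "FT", "speed": 85.2},
    {"id": 2, "type": "FT", "speed": 93.1},
    {"id": 3, "type": "CU", "speed": 78.4},
]

def apply_filters(pitches, pitch_type=None, speed_range=None):
    """Return pitches matching every active filter (an AND query)."""
    result = pitches
    if pitch_type is not None:
        result = [p for p in result if p["type"] == pitch_type]
    if speed_range is not None:
        lo, hi = speed_range
        result = [p for p in result if lo <= p["speed"] <= hi]
    return result

# Click 'FT', then drag over the 83-89 mph range:
slow_two_seamers = apply_filters(pitches, pitch_type="FT", speed_range=(83, 89))
# only pitch 1 matches
```

Because each filter narrows the same underlying list, the other controllers can redraw their dots from the filtered set immediately, which is what makes the controllers appear to "update" as a selection is made.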

Multi-Attribute Grid

The second way to select a set of pitches is by using the Multi-Attribute Grid. Initially, the Multi-Attribute Grid is set up to display the horizontal and vertical location of the pitch when it crossed the plate. A lasso selection tool allows for selecting a set of pitches within the view, and in Figure 4 we are selecting a set of pitches below the strikezone.
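A lasso selection like this reduces to a point-in-polygon test: the drawn lasso path becomes a polygon, and each dot's (x, y) plate location is tested against it. The ray-casting version below is a standard technique, shown as a sketch rather than the paper's actual implementation:

```python
# Standard ray-casting point-in-polygon test, used here to model a
# lasso selection over pitch plate locations. Coordinates are invented.

def point_in_polygon(x, y, polygon):
    """Count crossings of a rightward ray against each polygon edge."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge straddles the ray's y
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Select pitches whose plate location falls inside the lasso.
lasso = [(-1.0, 0.0), (1.0, 0.0), (1.0, 1.5), (-1.0, 1.5)]
pitch_locations = {1: (0.2, 0.8), 2: (0.0, 3.0)}
selected = {pid for pid, (px, py) in pitch_locations.items()
            if point_in_polygon(px, py, lasso)}
# pitch 1 is inside the lasso; pitch 2 (well above it) is not
```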

Fig 4. Selecting pitches near the bottom of the strikezone.

The real power of the Multi-Attribute Grid, however, is that you can drag any of the pitch attributes up to the grid to see how they interact. In Figure 5, for example, we drag the "speed" parameter to the color control, so the pitches are color-coded based on speed. Then we drag the "type" and "speed" attributes to the grid axes, so we can see how the speeds of the different pitch types compare to each other. Finally, we select the slow two-seam fastballs for viewing.

Fig 5. Changing the variables displayed in the multi-attribute grid, and showing a different way to select the slow two-seam fastballs.

Any of the 40+ variables can be placed on either axis or the color component of the Multi-Attribute Grid, and any of the continuous variables can be mapped to the point size. This allows for thousands of different visualizations of the data to be created with very little effort (Fig 6).
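One way to model this drag-and-drop remapping is as a small "spec" that binds each visual channel (x, y, color, size) to an attribute name, which then resolves to per-pitch draw values. The attribute names below are illustrative, not the actual metadata schema:

```python
# Sketch of channel->attribute remapping in a scatter view.
# Attribute names are illustrative, not the real PITCHf/x fields.

pitches = [
    {"type": "FT", "speed": 85.2, "spin_rate": 2100, "break_length": 4.1},
    {"type": "CU", "speed": 78.4, "spin_rate": 2600, "break_length": 9.8},
]

def resolve(spec, pitches):
    """Turn a channel->attribute spec into per-pitch draw values."""
    return [{channel: p[attr] for channel, attr in spec.items()}
            for p in pitches]

# "Drag" speed onto the x-axis, spin rate onto y, and type onto color.
spec = {"x": "speed", "y": "spin_rate", "color": "type"}
dots = resolve(spec, pitches)
# dots[0] == {"x": 85.2, "y": 2100, "color": "FT"}
```

With 40+ attributes available for each of the two axes and the color channel, even this simple binding scheme yields tens of thousands of distinct views, which is why so many visualizations can be produced with so little effort.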

Fig 6. Various views of the data in the Multi-Attribute Grid.

Example Use Cases

The following video shows the process and results for three example tasks:

  • Finding the lowest pitches hit for a home run.
  • Viewing the swings at pitches furthest from the middle of the plate.
  • Watching all double plays hit to the right side of the infield.

We think the Baseball Video Lens system could be useful for a number of different groups, including serious baseball fans, baseball teams/scouting departments, broadcast organizations, and baseball journalists. While each of these user groups might have different goals, they could all use the ability to find collections of clips within a large set of baseball videos.

More Information

For more information, please see the research paper or watch the longer video embedded below.
We have a limited number of demo hard-drives available. If you work for an MLB team or another organization which might be interested in exploring the Video Lens system further (or have any other general questions), please contact Justin Matejka through email or Twitter @JustinMatejka.