User testing is an integral component of user-centered design but has rarely been applied to visualization for cyber security applications. This article presents the results of a comparative evaluation of a visualization-based application and a more traditional, table-based application for analyzing computer network packet captures. We conducted this evaluation as part of the user-centered design process. Participants performed both structured, well-defined tasks and exploratory, open-ended tasks with each tool. We measured accuracy and efficiency on the well-defined tasks, counted insights generated during the exploratory tasks, and recorded user perceptions of each tool. The results of this evaluation demonstrate that users performed the well-defined tasks significantly more accurately with the visualization tool, discovered a greater number of insights with it, and expressed a clear preference for it over the table-based application. The study design presented may be useful to future researchers performing user testing on visualization for cyber security applications.