Twitter Follower / Friend Assessment Tool (TWIFFA)
Navy STTR 2020.A - Topic N20A-T017
ONR - Mr. Steve Sullivan email@example.com
Opens: January 14, 2020 - Closes: February 12, 2020 (8:00 PM ET)
Human Systems, Information Systems
Supports approved FY20 Tech Candidate: Cyber Information Environment for Assessment Nexus (CIEAN)
OBJECTIVE: Develop a capability to detect suspected "bot" (artificial) accounts on Twitter, using a probabilistic model with greater than 60% accuracy in detecting bots or bot-assisted accounts. Desired qualities include ease of use; the ability to monitor or flag suspected bots; and the ability to identify, block, or unfollow suspect accounts, with explanations available to help the user better understand the bots they may interact with on Twitter.
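The probabilistic scoring the objective calls for can be sketched as a simple logistic model over account features. The feature names, weights, and 0.6 decision threshold below are illustrative assumptions, not part of the solicitation; a real model would be trained on labeled bot/human account data.

```python
import math

# Hypothetical feature weights, for illustration only.
WEIGHTS = {
    "default_profile_image": 1.2,   # bots often keep the default avatar
    "followers_per_friend": -0.8,   # very low ratios are bot-like
    "tweets_per_day": 0.05,         # extreme posting rates are suspicious
}
BIAS = -1.0

def bot_probability(features):
    """Logistic score in [0, 1]; higher means more bot-like."""
    z = BIAS + sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

def is_suspected_bot(features, threshold=0.6):
    """Flag an account when its bot score clears the threshold."""
    return bot_probability(features) >= threshold
```

With trained weights, the same scoring function would drive the monitor/flag decisions described in this topic.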
DESCRIPTION: This STTR topic seeks development of a web-based tool or app that can assess an account's followers and friends to indicate which of those accounts are likely to be bots or bot-assisted. The tool would require the user to input their credentials and would then scan the followers and friends for signs that these accounts are bots or bot-assisted, according to an internal, proprietary model. The tool would provide situational awareness, enabling users to unfollow or block bots easily, or to monitor a bot for a short period to assess its activities. The bot-detection model should have an accuracy of greater than 60%. The tool also needs the capability to assess the threat level of dormant bots. Once bots are identified, the tool should be able to unfollow them in batches and to block them, if the user desires.
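The scan-flag-act workflow above can be sketched independently of any particular Twitter client: partition scored accounts by bot probability, then act on the suspects in batches. The function names, the 0.6 threshold, and the batch size of 100 are illustrative assumptions, not requirements from the solicitation.

```python
def partition_accounts(scored, threshold=0.6):
    """Split {account_id: bot_probability} into (suspected, cleared) lists."""
    suspected = sorted(a for a, p in scored.items() if p >= threshold)
    cleared = sorted(a for a, p in scored.items() if p < threshold)
    return suspected, cleared

def batched(account_ids, batch_size=100):
    """Yield account ids in fixed-size batches, e.g. so unfollow/block
    requests can respect the platform's API rate limits."""
    for i in range(0, len(account_ids), batch_size):
        yield account_ids[i:i + batch_size]
```

The actual unfollow/block calls would then be issued per batch through whatever Twitter API client the implementation adopts.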
PHASE I: Develop or improve algorithms for detecting bots on Twitter to produce a bot-detection model, ideally by improving or adapting an existing model or set of algorithms. Create a system for acquiring data on the followers and friends of a user's Twitter account, and create a simple web-based prototype suitable for testing and validation. Develop a Phase II plan.
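Testing and validation against the topic's accuracy bar reduces to comparing model predictions with a labeled ground-truth set. A minimal sketch of that check, assuming binary labels (1 = bot or bot-assisted, 0 = human):

```python
def detection_accuracy(predictions, labels):
    """Fraction of accounts classified correctly against labeled ground truth."""
    if len(predictions) != len(labels):
        raise ValueError("predictions and labels must be the same length")
    return sum(p == y for p, y in zip(predictions, labels)) / len(labels)

def meets_topic_threshold(predictions, labels, bar=0.60):
    """The topic asks for strictly greater than 60% detection accuracy."""
    return detection_accuracy(predictions, labels) > bar
```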
PHASE II: Create a mobile application and a web-based tool from the prototype. Ensure that model results are exportable to other tools. Develop a user-friendly interface available for testing and evaluation; built-in help features and guidance capabilities are highly desirable. Develop additional requirements for Phase III through engagement with stakeholders and potential customers.
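Making model results exportable to other tools suggests a plain, machine-readable interchange format. One sketch, serializing scored accounts to JSON; the schema and field names here are assumptions for illustration, not a mandated format:

```python
import json

def export_results(scored, threshold=0.6):
    """Serialize {account_id: bot_probability} as a JSON document
    that downstream tools can ingest."""
    records = [
        {"account_id": a,
         "bot_probability": round(p, 4),
         "suspected_bot": p >= threshold}
        for a, p in sorted(scored.items())
    ]
    return json.dumps({"results": records}, indent=2)
```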
PHASE III DUAL USE APPLICATIONS: Make the technology available on My Navy Portal. Expansion and development of models and capabilities is desirable, including functions to create a database of dormant bots that is interoperable with other tools. Capabilities to manage the database and meet the needs of multiple customers would be developed. Both the web-based and app tools would be of great utility to corporations, agencies, and individual users.
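The dormant-bot database could be backed by any relational store; a minimal sketch using SQLite follows. The schema and the threat-level bands are illustrative assumptions, not requirements from the solicitation.

```python
import sqlite3

def init_db(conn):
    """Create the dormant-bot table (illustrative schema)."""
    conn.execute("""
        CREATE TABLE IF NOT EXISTS dormant_bots (
            account_id      TEXT PRIMARY KEY,
            bot_probability REAL NOT NULL,
            last_activity   TEXT,
            threat_level    TEXT NOT NULL
        )""")

def record_dormant_bot(conn, account_id, probability, last_activity):
    """Store or update a dormant bot, assigning an assumed threat band
    from its bot probability."""
    if probability >= 0.9:
        level = "high"
    elif probability >= 0.6:
        level = "medium"
    else:
        level = "low"
    conn.execute(
        "INSERT OR REPLACE INTO dormant_bots VALUES (?, ?, ?, ?)",
        (account_id, probability, last_activity, level),
    )
    conn.commit()
    return level
```

Serving multiple customers would add per-customer scoping on top of this table; the single-table sketch only shows the core record-keeping.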
REFERENCES:
1. @DFRLab. "#BotSpot: Twelve Ways to Spot a Bot." Medium (DFRLab blog), 28 August 2017. https://medium.com/dfrlab/botspot-twelve-ways-to-spot-a-bot-aedc7d9c110c
2. Beskow, David M. and Carley, Kathleen. "Social Cybersecurity: An Emerging National Security Requirement." Military Review, March-April 2019. https://www.armyupress.army.mil/Portals/7/military-review/Archives/English/MA-2019/Beskow-Carley-Social-Cyber.pdf
3. Brundage, Miles, et al. "The Malicious Use of Artificial Intelligence: Forecasting, Prevention, and Mitigation." Future of Humanity Institute, University of Oxford; Arizona State University, February 2018. https://arxiv.org/ftp/arxiv/papers/1802/1802.07228.pdf
4. Lapowsky, Issie. "NATO Group Catfished Soldiers to Prove a Point About Privacy." Wired. https://www.wired.com/story/nato-stratcom-catfished-soldiers-social-media/
5. Nimmo, Ben; Czuperski, Maks; and Brookie, Graham. "#BotSpot: The Intimidators." @DFRLab.
6. Schreckinger, Ben. "How Russia Targets the U.S. Military." Politico, June 12, 2017. https://www.politico.com/magazine/story/2017/06/12/how-russia-targets-the-us-military-215247
KEYWORDS: C4ISR; Cyber Terrorism; Hybrid; Cyborg; Smart Botnets; Information Operations; Defensive