
Tokenization (data security)

Known as: Token, Tokenization 
Tokenization, when applied to data security, is the process of substituting a sensitive data element with a non-sensitive equivalent, referred to as… 
Source: Wikipedia
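
The definition above describes substituting a sensitive data element with a non-sensitive surrogate. As a rough illustration only, here is a minimal vault-based sketch in Python; the class name `TokenVault` and its methods are hypothetical, and production systems would use a hardened token vault or a vaultless cryptographic scheme rather than an in-memory dictionary.

```python
# Minimal sketch of vault-based tokenization (illustrative assumption, not a
# production design): random tokens stand in for sensitive values, and only
# the vault can map a token back to the original element.
import secrets

class TokenVault:
    def __init__(self):
        self._token_to_value = {}   # token -> sensitive value
        self._value_to_token = {}   # sensitive value -> token (keeps mapping stable)

    def tokenize(self, value: str) -> str:
        # Reuse the existing token so the same input always yields the same token.
        if value in self._value_to_token:
            return self._value_to_token[value]
        # Random token with no mathematical relationship to the original value.
        token = secrets.token_hex(8)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Recovery is only possible through the vault's lookup table.
        return self._token_to_value[token]

vault = TokenVault()
t = vault.tokenize("4111 1111 1111 1111")   # hypothetical card number
print(t)                                    # e.g. 'a3f1c0...'; safe to pass to downstream systems
print(vault.detokenize(t))                  # original value, recoverable only via the vault
```
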

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
Highly Cited
2017
In this paper we present two deep-learning systems that competed at SemEval-2017 Task 4 “Sentiment Analysis in Twitter”. We… 
Highly Cited
2017
Taxi-calling apps are gaining increasing popularity for their efficiency in dispatching idle taxis to passengers in need. To… 
Review
2010
This book provides system developers and researchers in natural language processing and computational linguistics with… 
Highly Cited
2006
We present an anomaly-based algorithm for detecting IRC-based botnet meshes. The algorithm combines an IRC mesh detection… 
Review
2005
We present a prototype system, code-named Pulse, for mining topics and sentiment orientation jointly from free text customer… 
Highly Cited
2004
To date, there are no fully automated systems addressing the community's need for fundamental language processing tools for… 
Highly Cited
2002
Published results indicate that automatic language identification (LID) systems that rely on multiple-language phone recognition… 
Highly Cited
2001
We propose a new efficient automatic verification technique, Athena, for security protocol analysis. It uses a new efficient… 
Highly Cited
1994
Any linguistic treatment of freely occurring text must provide an answer to what is considered as a token. In artificial…
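
The 1994 snippet raises the question of what counts as a token in running text. As a toy illustration unrelated to that paper's actual method, the sketch below shows how two plausible definitions of a token segment the same sentence differently; the regular expression is an assumption chosen for illustration.

```python
# Toy illustration: two reasonable definitions of a "token" produce different
# segmentations of the same sentence.
import re

text = "Dr. O'Neill paid $12.50, didn't he?"

whitespace_tokens = text.split()                  # tokens = whitespace-separated chunks
regex_tokens = re.findall(r"\w+|[^\w\s]", text)   # tokens = word runs or single punctuation marks

print(whitespace_tokens)  # ["Dr.", "O'Neill", "paid", "$12.50,", "didn't", "he?"]
print(regex_tokens)       # ['Dr', '.', 'O', "'", 'Neill', 'paid', '$', '12', '.', '50', ...]
```
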