1071. Speech Patterns (25) - PAT Advanced Level exam problem (map application)

People often have a preference among synonyms of the same word. For example, some may prefer "the police", while others may prefer "the cops". Analyzing such patterns can help to narrow down a speaker's identity, which is useful when validating, for example, whether it's still the same person behind an online avatar.

Now given a paragraph of text sampled from someone's speech, can you find the person's most commonly used word?

Input Specification:

Each input file contains one test case. For each case, there is one line of text no more than 1048576 characters in length, terminated by a carriage return '\n'. The input contains at least one alphanumerical character, i.e., one character from the set [0-9 A-Z a-z].

Output Specification:

For each test case, print in one line the most commonly occurring word in the input text, followed by a space and the number of times it has occurred in the input. If there are more than one such words, print the lexicographically smallest one. The word should be printed in all lower case. Here a "word" is defined as a continuous sequence of alphanumerical characters separated by non-alphanumerical characters or the line beginning/end.

Note that words are case insensitive.

Sample Input:

Can1: "Can a can can a can? It can!"

Sample Output:

can 5

Problem summary: count word frequencies. A valid word is a maximal run of letters (either case) and digits. Given a string, find the valid word that occurs the most times, along with its count. If several words tie, output the lexicographically smallest one.

Analysis: a map makes this straightforward, but there are a few points to watch:
1. Matching is case-insensitive, so convert each character with s[i] = tolower(s[i]) before counting.
2. The character class [0-9 A-Z a-z] is exactly what isalnum from the cctype header tests for.
3. Read the input with getline, since the line contains spaces.
4. Only increment m[t] when t is non-empty; otherwise the empty string gets counted too.
5. Most importantly, when i reaches the last position, the current token t must be flushed into the map (as long as its length is non-zero), regardless of whether that last character is alphanumeric.

#include <iostream>
#include <string>
#include <map>
#include <cctype>
using namespace std;
int main() {
    string s, t;
    getline(cin, s);                 // the line may contain spaces
    map<string, int> m;              // ordered map: iteration is lexicographic
    for (int i = 0; i < (int)s.length(); i++) {
        if (isalnum((unsigned char)s[i])) {
            s[i] = tolower(s[i]);    // case-insensitive counting
            t += s[i];
        }
        // flush the current token at a separator or at the end of the line
        if (!isalnum((unsigned char)s[i]) || i == (int)s.length() - 1) {
            if (t.length() != 0) m[t]++;   // skip empty tokens
            t = "";
        }
    }
    int maxn = 0;
    for (auto it = m.begin(); it != m.end(); it++) {
        if (it->second > maxn) {     // strict '>' keeps the first (smallest) word on ties
            t = it->first;
            maxn = it->second;
        }
    }
    cout << t << " " << maxn;
    return 0;
}

