A. Hongcow Learns the Cyclic Shift #385

time limit per test
2 seconds
memory limit per test
256 megabytes
input
standard input
output
standard output

Hongcow is learning to spell! One day, his teacher gives him a word that he needs to learn to spell. Being a dutiful student, he immediately learns how to spell the word.

Hongcow has decided to try to make new words from this one. He starts by taking the word he just learned how to spell, and moves the last character of the word to the beginning of the word. He calls this a cyclic shift. He can apply the cyclic shift many times. For example, consecutively applying the cyclic shift operation to the word "abracadabra", Hongcow will get the words "aabracadabr", "raabracadab" and so on.

Hongcow is now wondering how many distinct words he can generate by doing the cyclic shift arbitrarily many times. The initial string is also counted.

Input

The first line of input will be a single string s (1 ≤ |s| ≤ 50), the word Hongcow initially learns how to spell. The string s consists only of lowercase English letters ('a'–'z').

Output

Output a single integer equal to the number of distinct strings that Hongcow can obtain by applying the cyclic shift arbitrarily many times to the given string.

Examples
input
abcd
output
4
input
bbb
output
1
input
yzyz
output
2
Note

For the first sample, the strings Hongcow can generate are "abcd", "dabc", "cdab", and "bcda".

For the second sample, no matter how many times Hongcow does the cyclic shift, Hongcow can only generate "bbb".

For the third sample, the two strings Hongcow can generate are "yzyz" and "zyzy".



Original problem link: http://codeforces.com/contest/745/problem/A
/*
         _...---.._
       ,'          ~~"--..
      /                   ~"-._
     /                         ~-.
    /              .              `-.
    \             -.\                `-.
     \              ~-.                 `-.
  ,-~~\                ~.
 /     \                 `.
.       \                  `.
|        \                   .
|         \                   \
 .         `.                  \
             \                  \
  `           `.                 \
   `           \.                 \
    `           \`.                \
     .           \ -.               \
     `               -.              \
      .           `    -              \  .
      `            \    ~-             \
       `            .     ~.            \
        .            \      -_           \
        `                     -           \
         .            |        ~.          \
         `            |          \          \
          .           |           \          \
          `           |            `.         \
           `          `              \         \
            .          .              `.        `.
            `          :                \         `.
             \         `                 \          `.
              \         .                 `.         `~~-.
               \        :                   `         \   \
                .        .                   \         : `.\
                `        :                    \        |  | .
                 \        .                    \       |  |
                  \       :                     \      `  |  `
                   .                             .      | |_  .
                   `       `.                    `      ` | ~.;
                    \       `.                    .      . .
                     .       `.                   `      ` `
                     `.       `._.                 \      `.\
                      `        <  \                 `.     | .
                       `       `   :                 `     | |
                        `       \                     `    | |
                         `.     |   \                  :  .' |
"Are you crying? "        `     |    \                 `_-'  |
  "It's only the rain."  :    | |   |                   :    ;
"The rain already stopped."`    ; |~-.|                 :    '
  "Devils never cry."       :   \ |                     `   ,
                            `    \`                      :  '
                             :    \`                     `_/
                             `     .\       "For we have none. Our enemy shall fall."
                              `    ` \      "As we apprise. To claim our fate."
                               \    | :     "Now and forever. "
                                \  .'  :    "We'll be together."
                                 :    :    "In love and in hate"
                                 |    .'
                                 |    :     "They will see. We'll fight until eternity. "
                                 |    '     "Come with me.We'll stand and fight together."
                                 |   /      "Through our strength We'll make a better day. "
                                 `_.'       "Tomorrow we shall never surrender."
     sao xin*/
#include <bits/stdc++.h>
using namespace std;


#define LL long long
#define INF 0x3f3f3f3f
#define pi acos(-1)
//const LL INF = 1e15;
const int maxn=1e3+5;
const int maxx=1e5+5;
//const double q = (1 + sqrt(5.0)) / 2.0;   // 黄金分割数
/*
std::hex    <<16进制   cin>>std::hex>>a>>std::hex>>b
cout.setf(ios::uppercase);cout<<std::hex<<c<<endl;
//f[i]=(i-1)*(f[i-1]+f[i-2]);   错排
priority_queue<int,vector<int>,greater<int> >que3;//注意“>>”会被认为错误,
priority_queue<int,vector<int>,less<int> >que4;最大值优先
//str tmp vis val cnt  2486
struct point {
    int id;
    int ed;
    bool operator < (const point &a) const {
        return ed > a.ed;//结束时间早的优先级高
    }
} p;
*/
string in;
set<string> s;
int main()
{
	while(cin>>in)
	{
		s.clear();  // reset between test cases, otherwise counts accumulate
		int len = in.length();
		in += in;   // double the string: every cyclic shift is a length-len window of in+in
		for(int i=0;i<len;i++)
		{
			s.insert(in.substr(i, len));  // the i-th window is one cyclic shift
		}
		cout<<s.size()<<endl;
	}
	return 0;
}



This is an exercise in string handling. Alternatively, you only need to check the divisors of len (e.g. 4, 2, 1): if some rotation reproduces the original string, the string must consist of a repeating unit whose length divides len. I'm only sketching the idea here; I was too lazy to write the code.






