6. Handling duplicate values with a unique index
Drop the index created in step 5 above, then insert two records with the same rsc value.
coll.dropIndex("rsc");
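The insert of the duplicate record is not shown above; here is a minimal sketch, assuming the legacy com.mongodb Java driver used throughout this post and that coll is the DBCollection from step 5 (the field values mirror the second document listed below):
// assumes import com.mongodb.BasicDBObject; coll is the existing DBCollection
// with the rsc index dropped, a second document with the same rsc value is accepted
coll.insert(new BasicDBObject("rsc", "WRM160513").append("scheduleName", "kaokao"));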
Check the indexes:
index ---------
{ "name" : "_id_" , "ns" : "ms_basic.schedule" , "key" : { "_id" : 1}}
If we now create a unique index directly on the rsc field, it is bound to fail. Let's try it. The two documents with the duplicate rsc value are:
{ "_id" : { "$oid" : "4c2fe764dd969674cb31a0be"} , "liveStatus" : 0 , "startDate" : "2010-11-12" , "isList" : 0 , "location" : "CAG" , "posttime" : "2010-04-01 15:04:57.0" , "rsc" : "WRM160513" , "noliveTvName" : "" , "endDate" : "2010-11-12" , "endTime" : "09:30" , "isGold" : 0 , "version" : 0 , "startTime" : "09:30" , "disciplineName" : "摔跤" , "event" : "160" , "gender" : "M" , "isMedal" : 0 , "discipline" : "WR" , "venueId" : "CAG" , "phaseName" : "资格赛" , "isRecommend" : 0 , "isData" : 0 , "eventUnitType" : "HATH" , "chatroomStatus" : 0 , "isChina" : 0 , "isStartTime" : 0 , "competitionStatus" : 0 , "unit" : "13" , "tvName" : "男子古典式 - 60公斤级资格赛" , "flag" : 0 , "scheduleName" : "资格赛" , "isEndTime" : 0 , "rid" : "" , "eventName" : "男子古典式 - 60公斤级" , "phase" : "5" , "isResult" : 0}
{ "_id" : { "$oid" : "4c3472bd2f0ba08b4e097918"} , "rsc" : "WRM160513" , "scheduleName" : "kaokao"}
Build the index again:
coll.ensureIndex(new BasicDBObject("rsc", 1),"rsc",true);
Show the indexes:
index ---------
{ "name" : "_id_" , "ns" : "ms_basic.schedule" , "key" : { "_id" : 1}}
So the index was not created, yet surprisingly the Java call raised no error??
Look at the command line, where the error is reported:
> db.schedule.ensureIndex({rsc:1},{unique: true});
E11000 duplicate key error index: ms_basic.schedule.$rsc_1 dup key: { : "WRM160513" }
Checking the indexes on the schedule collection, we can see that the new index was not created:
> db.schedule.getIndexes();
[
    {
        "name" : "_id_",
        "ns" : "ms_basic.schedule",
        "key" : {
            "_id" : 1
        }
    }
]
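Why does the Java call look successful? A plausible explanation, assuming the 2.x-era driver used in these examples: its default write concern does not wait for the server's response, so the duplicate key failure from ensureIndex is dropped silently. A sketch of how the error can still be surfaced from Java by asking the database for the last error:
// sketch, assumes the legacy com.mongodb driver (DBCollection, CommandResult)
coll.ensureIndex(new BasicDBObject("rsc", 1), "rsc", true);
CommandResult lastError = coll.getDB().getLastError();
if (!lastError.ok()) {
    // prints the same E11000 duplicate key error that the shell shows
    System.out.println(lastError.getErrorMessage());
}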
You can add dropDups: true to the second JSON object (the options document); with it the unique index can be built even though duplicate values exist: the first document found for each duplicate value is kept, and all other documents with that value are deleted.
Let's test again with the dropDups option added. Although an error message is still printed, the unique index is created this time:
> db.schedule.ensureIndex({rsc:1},{unique:true,dropDups:true});
Success! Querying the collection again shows that the duplicate records have been deleted automatically.
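The same dropDups build can be done from Java through the ensureIndex overload that takes an options document; a sketch under the same legacy-driver assumption:
// assumes import com.mongodb.*; coll is the existing DBCollection
// keys document and options document, mirroring the shell command above
DBObject keys = new BasicDBObject("rsc", 1);
DBObject options = new BasicDBObject("unique", true).append("dropDups", true);
coll.ensureIndex(keys, options);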
Now try inserting the duplicate record again.
Inserting from Java still raises no error and seems to go through, but the document never actually makes it into the database. Check the command line:
> db.schedule.insert({rsc: 'WRM160513', scheduleName: 'kaokao'});
E11000 duplicate key error index: ms_basic.schedule.$rsc_1 dup key: { : "WRM160513" }
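To make the failure visible on the Java side as well, one option (a sketch, legacy driver assumed) is to switch the collection to an acknowledged write concern, so the duplicate insert throws instead of failing silently:
// assumes import com.mongodb.*; coll is the existing DBCollection
coll.setWriteConcern(WriteConcern.SAFE);
try {
    coll.insert(new BasicDBObject("rsc", "WRM160513").append("scheduleName", "kaokao"));
} catch (MongoException e) {
    // the E11000 duplicate key error now reaches the application
    System.out.println(e.getMessage());
}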
The explanation from the official MongoDB documentation:
A unique index cannot be created on a key that has duplicate values. If you would like to create the index anyway, keeping the first document the database indexes and deleting all subsequent documents that have duplicate values, add the dropDups option.
db.schedule.ensureIndex({rsc:1},{unique:true,dropDups:true})
This article is reposted from jooben's 51CTO blog. Original link: http://blog.51cto.com/jooben/365915