mbdavid / litedb
LiteDB - A .NET NoSQL Document Store in a single data file
Home Page: http://www.litedb.org
License: MIT License
Is it possible to password-protect a LiteDB file, as you would in Access?
I have a model with an interface collection property (ICollection), and I seem to be able to store my models OK (at least, I don't get any errors, and the file size looks about right).
But when I try to read any of these models back from the LiteDB collection, the collection property contains no elements. Note that it works OK if I use a concrete type, like List.
Currently, the GetCollection method requires a string name parameter.
Since a .db file is limited to storing one type of entity, I'm not sure why a name is required at all.
But if it must exist, it would be nice if it were at least changed to an optional parameter.
Maybe something like:
public LiteCollection<T> GetCollection<T>(string name = null)
{
    name = name ?? typeof(T).Name;
    ...
}
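The fallback above can be sketched in isolation. One wrinkle worth considering (GetCollectionName is a hypothetical helper, not LiteDB's actual API): typeof(T).Name yields compiler-mangled names such as "List`1" for generic types, so a real implementation might want to sanitize the result.

```csharp
using System;
using System.Collections.Generic;

// Sketch of the proposed fallback: when no name is given, derive the
// collection name from the element type.
string GetCollectionName<T>(string name = null) => name ?? typeof(T).Name;

Console.WriteLine(GetCollectionName<Uri>());              // falls back to the type name
Console.WriteLine(GetCollectionName<Uri>("customers"));   // explicit name wins
Console.WriteLine(GetCollectionName<List<int>>());        // generic types give mangled names like "List`1"
```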
I have the following code:
public class NDocument {
    [BsonId]
    public string Id { get; set; }
    public string Version { get; set; }
    public string Name { get; set; }
    public bool Split { get; set; }
    public NDocumentRecordBase[] Records { get; set; }

    public NDocument() {
    }
}

public class NDocumentRecordBase {
    public int Number { get; set; }
    public int Type { get; set; }
}

[Serializable]
public class NDocumentRecord<T> : NDocumentRecordBase {
    public T[] Items { get; set; }
}
class Program {
    //[Serializable]
    static void Main(string[] args) {
        using (var db = new LiteDatabase(@"filename=D:\LiteDb\MyData.db;journal=false")) {
            // Get a collection (or create, if it does not exist)
            var col = db.GetCollection<NDocument>("customers");
            var timer = new Stopwatch();
            timer.Start();
            foreach (var index in Enumerable.Range(1, 1000)) {
                var nDoc = new NDocument() {
                    Id = "G" + index,
                    Name = "Something",
                    Split = false,
                    Version = "A",
                };
                if (index % 2 == 0) {
                    nDoc.Records = Enumerable.Range(1, 100).Select(x => new NDocumentRecord<int>() {
                        Items = Enumerable.Range(1, 100).ToArray(),
                        Number = 1,
                        Type = 2
                    }).ToArray();
                } else if (index % 3 == 0) {
                    nDoc.Records = Enumerable.Range(1, 100).Select(x => new NDocumentRecord<string>() {
                        Items = Enumerable.Range(1, 100).Select(ind => "String" + ind).ToArray(),
                        Number = 1,
                        Type = 3
                    }).ToArray();
                } else {
                    nDoc.Records = Enumerable.Range(1, 100).Select(x => new NDocumentRecord<double>() {
                        Items = Enumerable.Range(1, 100).Select(y => (double)y).ToArray(),
                        Number = 1,
                        Type = 4
                    }).ToArray();
                }
                col.Insert(nDoc);
            }
            timer.Stop();
            Console.WriteLine("Time taken {0} ms", timer.ElapsedMilliseconds);
            timer.Reset();
            timer.Start();
            var de = col.FindAll().ToArray();
            foreach (var dd in de) {
                //Console.Write(dd.Name);
            }
            timer.Stop();
            Console.WriteLine("Time taken {0} ms for {1}", timer.ElapsedMilliseconds, de.Count());
        }
        Console.ReadKey();
    }
}
The result is like this
Time taken to write 10239 ms for 1000
Time taken to read 9108 ms for 1000
OK, the write is slow, which I think is normal?
Thanks
Hello! First of all, congratulations, man: the design is amazing. I really enjoyed it and would like to help. For now I'll just post this problem; I took a look at the source, but did not have time to try to solve it. It seems that the serialization depth counter is not counting correctly when it serializes lists.
Excuse my English, I am Brazilian and I used google translate.
Hi,
I'm using LiteDB with NancyFX to make a small stub app. While running around 20 requests all at once, I ran into this issue.
It looks like Nancy is spawning threads to deal with the load, and LiteDB is having trouble keeping up and locks.
Is LiteDB threadsafe?
Shall I use locking to get around this issue?
Am I missing something?
Thanks.
An exception of type 'LiteDB.LiteException' occurred in LiteDB.dll but was not handled in user code
Additional information: Timeout. Database is locked for more than 00:01:00
at LiteDB.DiskService.TryExec(TimeSpan timeout, Action action)
at LiteDB.DiskService.Lock()
at LiteDB.RecoveryService.DoRecovery(BinaryReader reader)
at LiteDB.RecoveryService.<TryRecovery>b__0(FileStream stream)
at LiteDB.RecoveryService.OpenExclusiveFile(String filename, Action`1 success)
at LiteDB.RecoveryService.TryRecovery()
at LiteDB.LiteDatabase..ctor(String connectionString)
at NancyStub.CustomersModule.<.ctor>b__e(Object _) in c:\module.cs:line 103
at CallSite.Target(Closure , CallSite , Func`2 , Object )
at Nancy.Routing.Route.<>c__DisplayClass4.<Wrap>b__3(Object parameters, CancellationToken context)
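Until the thread-safety question is settled, one common workaround (a sketch, not LiteDB-specific advice) is to serialize access to the database behind a single gate such as a SemaphoreSlim, so concurrent Nancy requests never touch the file at the same time:

```csharp
using System;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

// One gate for all requests; the body of HandleRequestAsync stands in for
// whatever LiteDB work a request would perform.
var gate = new SemaphoreSlim(1, 1);
int concurrent = 0, maxConcurrent = 0;

async Task HandleRequestAsync()
{
    await gate.WaitAsync();
    try
    {
        concurrent++;                                   // safe: only one task holds the gate
        if (concurrent > maxConcurrent) maxConcurrent = concurrent;
        await Task.Delay(5);                            // simulate the database call
        concurrent--;
    }
    finally
    {
        gate.Release();
    }
}

// 20 simultaneous "requests", as in the scenario described above.
await Task.WhenAll(Enumerable.Range(0, 20).Select(_ => HandleRequestAsync()));
Console.WriteLine($"max concurrent database users: {maxConcurrent}");
```

This trades throughput for safety; it only helps within a single process, not across processes sharing the file.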
I add a large file and then delete it (using FileStorage.Delete(fileID)).
But this does not decrease the size of the database file.
I want to reduce the file size. Is there any option?
I have this class which I want to insert and update :
public class UnitCard {
    [BsonId]
    public String name { get; set; }
    public String hp { get; set; }
}
I update/insert it with the following code
if (cardsCollection.Exists(c => c.name.Equals(card.name))) {
    if (!cardsCollection.Update(card)) {
        throw new CouldNotSaveCardException(); // My own exception
    }
} else {
    cardsCollection.Insert(card);
}
"cardsCollection.Update(card)" always returns false.
I tried "cardsCollection.Update(card.name, card)" with no success, and even tried
UnitCard found = cardsCollection.FindOne(c => c.name.Equals(card.name));
cardsCollection.Update(found.name, card);
without success either.
I tried using ObjectIds, and updating worked as intended, but the behavior I really want is with [BsonId].
Hi !
I am a coder from China. I have forked a blog called MZBlog and changed its db to LiteDB. This is MZBlog-LiteDB.
I have not used LiteDB before, so if there is any mistake, please point it out for me.
I have also used your FileDB.
I am eagerly looking forward to your kind reply!
Best Regards
So the case which did not work was when you set the property value to Enumerable.Range(1, 9999); then it fails with an index-out-of-bounds exception.
You/I can create a separate issue for this if you like.
It works in MongoDB.
If I use a POCO with a nullable decimal, it looks to be stored correctly as a double, but I get a System.InvalidCastException when I try to retrieve the object.
Fetching it as a BsonDocument works fine, and using a non nullable decimal also works fine.
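The symptom is consistent with ordinary .NET unboxing rules; a minimal repro of the cast failure, independent of LiteDB:

```csharp
using System;

// A decimal? that was persisted as a double comes back boxed as a double,
// and unboxing a boxed double directly to decimal? throws InvalidCastException.
object stored = 123.45d; // stand-in for the value handed back on retrieval

bool threw = false;
try
{
    decimal? wrong = (decimal?)stored; // invalid unbox: double -> decimal?
}
catch (InvalidCastException)
{
    threw = true;
}

// Unboxing as the real runtime type first, then converting, works:
decimal? right = (decimal?)(double)stored;
Console.WriteLine($"direct cast threw: {threw}, two-step value: {right}");
```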
Hello,
I would like to know if you have planned to release a portable class library for Windows Phone and Windows 8. Thanks in advance for your reply.
I have a small WCF web service.
Now it has a few web sites packed into LiteDB.
When a user requests web files, it opens the db and returns the files and scripts.
It works OK.
I'd like to use a LiteDB file (read-only) as an embedded resource of this .NET app and open it when needed.
One of the reasons is: when it is hosted in IIS, it sometimes cannot open the file because of permissions etc.
If it is a resource, there are no problems of this kind.
Is it possible to open LiteDB from a .NET resource? From a stream, or from memory?
Hi David,
I was playing around with LiteDB by changing the max size of a doc to be ~16 MB:
public const int MAX_DOCUMENT_SIZE = 4079 * BasePage.PAGE_AVAILABLE_BYTES;
And here is my test code: http://1drv.ms/1GyVN8v
I checked in the inspect window and found that the values for the Records property (in my example) were not coming back correctly.
I did not investigate further, assuming you would know what's going wrong.
Hello everyone,
I have a one-to-many relationship and I want to model the relations independently, so I use three collections:
one: the one-object
second: the n-objects
third: a link-collection
Furthermore, I want to use strings as primary keys (Guid).
Here is my example; I have no idea how to fix the class-cast exception. Maybe one of you can help me? I am out of ideas.
Thank you very much,
Martin.
using GraphDMS.Core.Model.Core;
using LiteDB;
using System;
using System.Collections.Generic;
namespace testcase
{
    public class InfoElement
    {
        public InfoElement() {}
        [BsonId]
        public string Id { get; set; } // Guid.ToString()
        public string Name { get; set; }
    }

    // many : colChildren
    public class InfoElementItem
    {
        public InfoElementItem() {}
        [BsonId]
        public string Id { get; set; } // Guid.ToString()
        public string Name { get; set; }
    }

    public class LinkWrapper
    {
        public LinkWrapper() { }
        [BsonId]
        public string Id { get; set; }
        public string InfoElementId { get; set; }
        public List<String> FkListInfoElementItem { get; set; }
    }

    class Program
    {
        static void Main(string[] args)
        {
            // Open database (or create if it does not exist)
            using (var db = new LiteDatabase(@"C:\temp\test.db"))
            {
                InfoElement ie = new InfoElement();
                ie.Id = Guid.NewGuid().ToString();
                db.BeginTrans();
                var colInfoElementList = db.GetCollection<InfoElement>("InfoElementList");
                colInfoElementList.Insert(ie);
                var colInfoElementItemList = db.GetCollection<InfoElementItem>("InfoElementItemList");
                InfoElementItem uno = new InfoElementItem();
                uno.Id = Guid.NewGuid().ToString();
                colInfoElementItemList.Insert(uno);
                InfoElementItem dos = new InfoElementItem();
                dos.Id = Guid.NewGuid().ToString();
                colInfoElementItemList.Insert(dos);
                // link some kids
                var colLinkWrapper = db.GetCollection<LinkWrapper>("LinkWrapperList");
                LinkWrapper link = new LinkWrapper();
                link.InfoElementId = ie.Id;
                link.FkListInfoElementItem = new List<String>();
                link.FkListInfoElementItem.Add(uno.Id);
                link.FkListInfoElementItem.Add(dos.Id);
                colLinkWrapper.Insert(link);
                db.Commit();
            }
            // query
            using (var db = new LiteDatabase(@"C:\temp\test.db"))
            {
                var colInfoElementList = db.GetCollection<InfoElement>("InfoElementList");
                var colLinkWrapper = db.GetCollection<LinkWrapper>("LinkWrapperList");
                colLinkWrapper.EnsureIndex(x => x.InfoElementId);
                foreach (InfoElement element in colInfoElementList.FindAll())
                {
                    // throws: "LiteDB.ObjectId cannot be cast to System.String"
                    foreach (var tagElement in colLinkWrapper.Find(Query.EQ("InfoElementId", element.Id)))
                    {
                        // here I want to do some stuff
                        Console.WriteLine("I want to come here!");
                    }
                }
            }
        }
    }
}
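A possible cause, judging only from the error message and not verified against LiteDB internals: link.Id is never assigned before Insert, so the engine may auto-generate an ObjectId for the missing _id, which later cannot be cast back to the string property. A defensive sketch:

```csharp
using System;

// Always populate string [BsonId] properties yourself before calling Insert,
// so the engine never stores an auto-generated ObjectId where a string is
// expected on the way back out.
string NewId() => Guid.NewGuid().ToString();

var linkId = NewId(); // e.g. link.Id = NewId(); before colLinkWrapper.Insert(link);
Console.WriteLine(linkId);
```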
Hello,
Is it possible to increase the document size limit from 1 MB to 16 MB, like MongoDB has? If not, what is the complication?
Thanks
I can't seem to get my sub-document to save under its parent.
var settings = new AlertSettings();
settings.Recipients.Add(new EmailRecipient());

using (var db = new LiteEngine(_dbPath))
{
    Collection<AlertSettings> col = db.GetCollection<AlertSettings>("alertSettings");
    col.EnsureIndex(t => t.Id);
    col.Insert(settings);
    Assert.That(col.FindOne(t => t.Id == settings.Id).Recipients.Count, Is.EqualTo(1)); // Assert fails
}
Am I doing something wrong?
Is it possible to load a many-to-many relationship? The appropriate keys are in the data store, I just need to be able to load them into the appropriate object.
Hello David,
I see LiteDB is catching a lot of eyes and it's an excellent project. Sometimes we need help, or have queries or discussions, for which I sometimes mail you personally or create an issue.
Instead of this, can we start using https://gitter.im ? It's an excellent tool and a lot of big projects have started using it.
It would be nice to have the same for LiteDB, as we have a good growing community.
Thanks
I created a quick sample application that stores a number of jpg images using FileStorage. The application is a web application that exposes an endpoint where the images can be downloaded. I created a sample web page that references a number of the images I have stored. When the requests hit the endpoint simultaneously, just about every other request throws this exception:
System.InvalidOperationException: Collection was modified after the enumerator was instantiated.
at System.ThrowHelper.ThrowInvalidOperationException(ExceptionResource resource)
at System.Collections.Generic.SortedSet`1.Enumerator.MoveNext()
at System.Collections.Generic.SortedDictionary`2.Enumerator.MoveNext()
at System.Collections.Generic.SortedDictionary`2.ValueCollection.Enumerator.MoveNext()
at System.Linq.Enumerable.WhereSelectEnumerableIterator`2.MoveNext()
at System.Collections.Generic.List`1..ctor(IEnumerable`1 collection)
at System.Linq.Enumerable.ToList[TSource](IEnumerable`1 source)
at LiteDB.CacheService.RemoveExtendPages()
at LiteDB.LiteFileStream.GetChunkData(Int32 index)
at LiteDB.LiteFileStream..ctor(LiteEngine engine, FileEntry entry)
at LiteDB.FileEntry.OpenRead()
I am guessing here, but it looks like RemoveExtendPages method doesn't have a lock on the collection it is enumerating?
I've upped the number of records added by the demo application. When the file size hits about 4GB, I get a failure. Anything I should try?
Hi! I am trying to set BsonIndex attribute with parameters, like in example but it seems that it's impossible. Compiler says "An attribute argument must be a constant expression, typeof expression or array creation expression of an attribute parameter type".
Am I missing something obvious or this is wrong attribute implementation?
public class Customer
{
    public int Id { get; set; }

    [BsonIndex]
    public string Name { get; set; }

    // the line below doesn't compile
    [BsonIndex(new IndexOptions { Unique = true, IgnoreCase = false, RemoveAccents = false })]
    public string Email { get; set; }
}
Is it possible to include auto-incrementing Ids, for instance by using an attribute on the POCO?
That would be awesome!!!
Awesome Project By The Way
Hi,
I was wondering what the difference is between inserting data as normal BSON documents and using FileStorage, in terms of persisting the data to disk.
Right now I have a large C# object which I store via FileStorage. I am wondering what problems that approach may have and whether it is a good idea.
Thanks
Hello, here again is my small test program, which is not working. It fails with the following
Exception: Failed to create instance for type 'TestLiteDb.Program+Employee' from assembly
Stack Trace:
at LiteDB.Reflection.CreateInstance(Type type)
at LiteDB.BsonMapper.Deserialize(Type type, BsonValue value)
at LiteDB.BsonMapper.ToObject[T](BsonDocument doc)
at LiteDB.LiteCollection`1.d__27.MoveNext()
at TestLiteDb.Program.Main(String[] args) in d:\OneDrive\Edu\C#\TestLiteDb\TestLiteDb\Program.cs:line 54
at System.AppDomain._nExecuteAssembly(RuntimeAssembly assembly, String[] args)
at System.AppDomain.ExecuteAssembly(String assemblyFile, Evidence assemblySecurity, String[] args)
at Microsoft.VisualStudio.HostingProcess.HostProc.RunUsersAssembly()
at System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)
at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)
at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state)
at System.Threading.ThreadHelper.ThreadStart()
[Serializable]
public class Employee {
    [BsonId]
    public string EmployeeID { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public List<string> Objects { get; set; }
    public byte[] Bytearray { get; set; }
}
using (var db = new LiteDatabase(@"filename=D:\LiteDb\MyData.db;journal=false")) {
    // Get a collection (or create, if it does not exist)
    var col = db.GetCollection<Employee>("customers");
    var timer = new Stopwatch();
    timer.Start();
    foreach (var index in Enumerable.Range(1, 10000)) {
        var emp = new Employee() {
            EmployeeID = "wyu" + index,
            FirstName = "one",
            LastName = "two",
            //Bytearray = new byte[1 * 1000 * 1000]
        };
        col.Insert(emp);
    }
    var de = col.FindAll();
    foreach (var dd in de) { // Fails HERE
        Console.Write(dd.EmployeeID);
    }
    timer.Stop();
    Console.WriteLine("Time taken {0} ms", timer.ElapsedMilliseconds);
}
Console.ReadKey();
I've made a minimal unit test, which doesn't show this behaviour.
I suspect it may be related to the collection type?
Test method threw exception:
System.NotImplementedException: The method or operation is not implemented.
Result StackTrace:
at LiteDB.BinaryReaderExtensions.ReadBsonValue(BinaryReader reader, UInt16 length)
at LiteDB.IndexPage.ReadContent(BinaryReader reader)
at LiteDB.DiskService.<>c__DisplayClass11.<ReadPage>b__0()
at LiteDB.DiskService.TryExec(TimeSpan timeout, Action action)
at LiteDB.DiskService.ReadPage[T](UInt32 pageID)
at LiteDB.PageService.GetPage[T](UInt32 pageID)
at LiteDB.IndexService.GetNode(PageAddress address)
at LiteDB.IndexService.<FindAll>d__2.MoveNext()
at LiteDB.CollectionService.Drop(CollectionPage col)
at LiteDB.LiteCollection`1.Drop()
at LiteDB.LiteDatabase.DropCollection(String name)
at Accord.Service.AccordService.Reset() in
snip
When serializing BSON, the first four bytes represent the total length of the BSON document. This length should include the length bytes themselves. While the BSON specification is arguably vague on this, popular implementations, including MongoDB, include these four bytes in the count.
For example, assuming our BSON document looks like this (Json representation):
{ "FirstName" : "John", "LastName" : "Smith" }
The MongoDB BSON serializer will produce the following bytes:
2d 00 00 00 02 46 69 72 73 74 4e 61 6d 65 00 05 00 00 00 4a 6f 68 6e 00 02 4c 61
73 74 4e 61 6d 65 00 06 00 00 00 53 6d 69 74 68 00 00
The LiteDB BSON serializer produces this:
28 00 00 00 02 46 69 72 73 74 4e 61 6d 65 00 05 00 00 00 4a 6f 68 6e 00 02 4c 61
73 74 4e 61 6d 65 00 06 00 00 00 53 6d 69 74 68 00 00
Notice that all the bytes are the same, except the first one. 2d versus 28. The MongoDB serializer includes the leading four bytes in the count while LiteDB does not.
The consequence is that the BSON between the two are not compatible, so serialization/deserialization between LiteDB and MongoDB fails when using BSON.
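The length prefix behaviour is easy to check mechanically. Reading the first four bytes of the MongoDB output above as a little-endian int32 gives exactly the document's total size:

```csharp
using System;

// The document from the post, as produced by MongoDB's serializer.
// Per the BSON spec, the leading int32 is the total size of the document,
// including the four length bytes themselves and the trailing 0x00.
byte[] doc =
{
    0x2d, 0x00, 0x00, 0x00, 0x02, 0x46, 0x69, 0x72, 0x73, 0x74, 0x4e, 0x61, 0x6d, 0x65,
    0x00, 0x05, 0x00, 0x00, 0x00, 0x4a, 0x6f, 0x68, 0x6e, 0x00, 0x02, 0x4c, 0x61,
    0x73, 0x74, 0x4e, 0x61, 0x6d, 0x65, 0x00, 0x06, 0x00, 0x00, 0x00, 0x53, 0x6d,
    0x69, 0x74, 0x68, 0x00, 0x00
};

// BSON integers are little-endian; BitConverter matches on the platforms
// .NET commonly runs on (all little-endian).
int declaredLength = BitConverter.ToInt32(doc, 0);
Console.WriteLine($"declared: {declaredLength}, actual: {doc.Length}");
```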
After doing spikes with NDatabase, RaptorDB, iBoxDB and LiteDB, I am strongly leaning toward using LiteDB as my lightweight, embedded document datastore of choice due to the easy to understand (and read) examples, intuitive API and Linq query functionality via visitor pattern. Nice library!
The main rub for me at the moment is the imposition that I must change my domain entities to have a property named "ID", that ends with "ID" or that has a BsonIdAttribute. I do think that it is a nice convention to have since a lot of entities do have an ID field. Likewise, the attribute is also a nice option for some. However, I do not like the design of my domain entities to be driven by third party frameworks, and I also prefer not to litter them with attributes from 3rd party namespaces / attributes.
So I think there really needs to be a way for a user to designate a unique identifier field via the API. Ideally, this could be configured on the Collection object right alongside any calls to "EnsureIndex".
I went ahead and took a quick stab at this and got it working for my purposes. The change is very simple:
In BsonSerializer.cs, change the _cacheId field to protected:
internal static Dictionary<Type, PropertyInfo> _cacheId = new Dictionary<Type, PropertyInfo>();
In Collection.Index.cs, add the following API method:
/// <summary>
/// Sets the designated property to act as the primary key for this entity.
/// </summary>
/// <typeparam name="K"></typeparam>
/// <param name="property"></param>
public void SetPrimaryKey<K>(Expression<Func<T, K>> property)
{
    var type = typeof(T);
    var p = QueryVisitor.GetProperty<T, K>(property);
    lock (BsonSerializer._cacheId)
    {
        BsonSerializer._cacheId[type] = p;
    }
}
NOTE: Method could alternatively be called "SetUniqueIndex", or whatever.
Example of calling this:
using (var db = CreateDB())
{
    var collection = GetCollection<Domain.Document>(db);
    collection.SetPrimaryKey(d => d.Title);
    collection.EnsureIndex(d => d.Author);
}
(It might also be nice to store an array of PropertyInfo objects so as to allow composite keys.)
Thanks,
Jordan
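For anyone curious how the expression-to-property step works, here is a minimal standalone version of what a call like QueryVisitor.GetProperty has to do, using string.Length in place of a domain property:

```csharp
using System;
using System.Linq.Expressions;
using System.Reflection;

// Turn a selector lambda like `d => d.Title` into a PropertyInfo.
// A production version would also unwrap UnaryExpression (Convert) nodes,
// which appear when the property type is boxed to object.
PropertyInfo GetProperty<T, K>(Expression<Func<T, K>> selector)
{
    var member = (MemberExpression)selector.Body;
    return (PropertyInfo)member.Member;
}

var prop = GetProperty<string, int>(s => s.Length);
Console.WriteLine(prop.Name);
```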
I have a model with an interface property, and seem to be able to store my models OK (at least, I don't get any errors, and the file size looks about right).
But when I try to read any of these models back from the collection, I get Invalid type owner for DynamicMethod.
Hello,
I compared three versions of a test application. I saved a lot (10^7) objects of a type with three properties of type double.
The first version was coded with saving to a file with json serialization (Newtonsoft Json). The second was with litedb. The third with simply saving to a txt-File (File.WriteAllLines).
Code LiteDB:
// Open database (or create if it does not exist)
using (var db = new LiteDatabase(@".\LiteDB.db"))
{
    // Get customer collection
    var col = db.GetCollection<Node>("nodes");
    col.Insert(nodes);
}
The results are:
| Version         | Memory Usage | Disk Space Usage | Time Needed [ms] |
|-----------------|--------------|------------------|------------------|
| Newtonsoft Json | ~500 MB      | 673 MB           | 68,630 ms        |
| LiteDB          | ~6,675 MB    | 3,476 MB         | 497,464 ms       |
| Text File       | ~500 MB      | 950 MB           | 38,913 ms        |
Is there a faster way of saving larger data sets?
It's not really an issue, but I think you should implement a shrink/compact method, because after a delete operation the database file always stays the same size.
Regards
I add a set of files to LiteDB.
For example, this file can be added successfully:
F:\monitor\index.html
But this one gives me an error about an invalid fileId:
F:\monitor\css\background.png
Why? When I comment out the thrown exception, it works OK.
How can I avoid this kind of error?
My understanding is that this excludes using interface types for properties.
Is there a specific reason you hardcode it, or could it be made configurable?
In my comparison of small NoSQL databases, one difference I found is that LiteDB requires an ID field, whereas some of the others (like NDatabase) do not.
For many cases I can see why an ID would be required.
However, consider my current scenario, where I want to store a list of records that I will never need to pull up individually via a unique ID field:
The properties for my entity are "ProjectURL", "Title", and "Filename".
In this case, none of these fields are unique-identifier candidates, but that should be OK because my use case dictates that I will only ever need to query multiple record results via the "ProjectURL" property. So I should be able to call "EnsureIndex(x => x.ProjectURL)" and be done.
However, LiteDB throws an exception because I do not have an ID!
Now I know that I could work around this problem by adding a GUID ID property that is generated in the entity constructor, but I would prefer to not let a 3rd party library drive my domain design.
So my question is this: Is it absolutely necessary for LiteDB to REQUIRE an ID / PrimaryKey field at all? Could this be made to be optional? If it were optional, then you could move the check and exception into the "FindById" function (since this function obviously depends on the fact that an ID has been specified previously).
Thanks,
Jordan
Hi David!
LiteDB is great job!
When I tried to run the unit tests, there was a failed test inside the Bson_Test() method. At calling Assert.AreEqual(d["minDate"].AsDateTime, DateTime.MinValue) the test result was "Assert.AreEqual failed. Expected:<0001.01.01. 1:00:00>. Actual:<0001.01.01. 0:00:00>." Before the serialization the value of d["minDate"] was "0001.01.01. 0:00:00", but after the deserialization it was "0001.01.01. 1:00:00" (one hour different).
Thanks.
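The one-hour shift suggests a local-time/UTC round-trip of a DateTime whose Kind was lost (the reporter appears to be in a UTC+1 zone). A minimal illustration, independent of LiteDB:

```csharp
using System;

// DateTime carries a Kind (Utc, Local, Unspecified) that is easy to lose in
// serialization. MinValue starts out Unspecified; if a round-trip treats it
// as local time in a UTC+1 zone, it comes back shifted by exactly one hour,
// matching the failing assertion in Bson_Test().
Console.WriteLine(DateTime.MinValue.Kind); // Unspecified

var asUtc = DateTime.SpecifyKind(new DateTime(2015, 1, 15, 12, 0, 0), DateTimeKind.Utc);
var asLocal = asUtc.ToLocalTime(); // wall-clock shifts by this machine's UTC offset
Console.WriteLine($"{asUtc:o} -> {asLocal:o}");
```

A robust serializer either stores the Kind alongside the ticks or normalizes everything to UTC on write and restores Kind on read.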
I have a WinForms program that uses LiteDB. It works on Windows 7, but on XP it fails.
Using the LiteDB shell to open the db file works correctly on Windows 7; on XP I get an error:
open message.db
db.JCYB.find
Object reference not set to an instance of an object.
In my program log, the error stack is as follows:
System.NullReferenceException: Object reference not set to an instance of an object.
at LiteDB.IndexNode.IsHeadTail(CollectionIndex index)
at LiteDB.IndexService.d__2.MoveNext()
at LiteDB.LiteCollection`1.EnsureIndex(String field, IndexOptions options)
at LiteDB.LiteCollection`1.EnsureIndex(String field, Boolean unique)
at LiteDB.Query.Run[T](LiteCollection`1 collection)
at LiteDB.LiteCollection`1.d__9.MoveNext()
at System.Linq.Enumerable.d__4d`1.MoveNext()
at System.Linq.Enumerable.<TakeIterator>d__3a`1.MoveNext()
at System.Linq.Buffer`1..ctor(IEnumerable`1 source)
at System.Linq.OrderedEnumerable`1.<GetEnumerator>d__0.MoveNext()
at System.Collections.Generic.List`1..ctor(IEnumerable`1 collection)
When I call
using (var db = new LiteDatabase(SettingsManager.DbFilePath))
an error occurs:
A first chance exception of type 'System.IO.FileNotFoundException' occurred in mscorlib.dll
but despite that, everything works.
Hi Mauricio,
Thanks for this library, I'm enjoying using it at the moment. We're looking at using MongoDB as our main database system, while also using another embedded solution (maybe LiteDB) for the desktop version of our software.
Do you think it's possible for LiteDB to use the same ObjectId as MongoDB, so that we can re-use one C# class for both storage types?
Thanks,
Mark.
Hi,
I have a small example which fails
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Runtime.Serialization;
using System.Runtime.Serialization.Formatters.Binary;
using LiteDB;
namespace TestLiteDb {
    class Program {
        [Serializable]
        public class Employee {
            public int EmployeeID { get; set; }
            public string FirstName { get; set; }
            public string LastName { get; set; }
            public List<string> Objects { get; set; }
        }

        static void Main(string[] args) {
            using (var db = new LiteDatabase(@"filename=E:\Work\LiteDb\MyData.db;journal=false")) {
                // Get a collection (or create, if it does not exist)
                var col = db.GetCollection<Employee>("customers");
                var emp = new Employee() {
                    EmployeeID = 1,
                    FirstName = "one",
                    LastName = "two",
                };
                var listofString = Enumerable.Range(1, 100000).Select(ind => "String" + ind).ToList();
                emp.Objects = listofString;
                var s = SerializeToStream(emp);
                db.FileStorage.Upload("something", s);
                var exist = db.FileStorage.Exists("something"); // Finds correctly
                LiteFileInfo file = db.FileStorage.FindById("something"); // This one fails
                MemoryStream stream = new MemoryStream();
                file.CopyTo(stream);
                var b = (Employee)DeserializeFromStream(stream);
            }
        }

        public static MemoryStream SerializeToStream(object o) {
            MemoryStream stream = new MemoryStream();
            IFormatter formatter = new BinaryFormatter();
            formatter.Serialize(stream, o);
            return stream;
        }

        public static object DeserializeFromStream(MemoryStream stream) {
            IFormatter formatter = new BinaryFormatter();
            stream.Seek(0, SeekOrigin.Begin);
            object o = formatter.Deserialize(stream);
            return o;
        }
    }
}
So basically the content length is 0.
And I see there is no unit test covering this either.
Thanks
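A likely cause, judging from the code above (an assumption worth checking): SerializeToStream returns the MemoryStream with its position at the end, so an upload that reads from the current position sees zero bytes. A LiteDB-free illustration:

```csharp
using System;
using System.IO;

// After writing, a MemoryStream's Position sits at the end, so anything
// reading "from here" gets zero bytes until the position is rewound.
var stream = new MemoryStream();
var payload = new byte[] { 1, 2, 3, 4 };
stream.Write(payload, 0, payload.Length);

long readableNow = stream.Length - stream.Position;       // 0: position is at the end
stream.Seek(0, SeekOrigin.Begin);                         // rewind before handing the stream off
long readableAfterSeek = stream.Length - stream.Position; // 4

Console.WriteLine($"before seek: {readableNow}, after seek: {readableAfterSeek}");
```

So adding stream.Seek(0, SeekOrigin.Begin) before returning from SerializeToStream (or before calling Upload) is worth trying.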
Hi,
I wrote an application that can send updates from an editor to a client. I want to implement a db-check mechanism in the client that checks the db's last-modified date before installing it. Can I use the db file's modified date for this kind of work?
Or, how can I use a global update trigger in the editor for each change, like this:
public class AppData : LiteDatabase {
    public AppData()
        : base(DataStore.ConnectionString) { }

    protected override void Commit() {
        var cl = this.GetCollection<DbInfo>("db");
        var dbi = cl.FindById(0);
        dbi.ModifiedOn = DateTime.UtcNow;
        cl.Update(dbi);
        base.Commit();
    }
}
English isn’t my first language, so please excuse any mistakes.
FileStorage.Upload("Core.dll", @"E:\Core.dll");
if (FileStorage.Exists("Core.dll"))           // returns true
    FileStorage.Delete("Core.dll");           // returns false
FileStorage.Delete("Core.dll".ToLower());     // returns true
Must Delete be sent a lower-case id? Why?
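Pending an answer, a defensive workaround (assuming, from the behaviour above, that LiteDB normalizes file ids to lower case on upload) is to normalize ids yourself before every FileStorage call:

```csharp
using System;

// Route every file id through one helper so Upload, Exists and Delete
// always see the same casing.
string NormalizeFileId(string id) => id.ToLowerInvariant();

Console.WriteLine(NormalizeFileId("Core.dll"));
```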
Hello everyone!
I will use this issue to post some features/ideas that I'm thinking to implement in LiteDB. If you have new ideas, lets talk about that here. If you want, you can just +1 on some feature that are you interested just to I know 😄
Looking at the source code, it looks like it's now mandatory to have the BsonId attribute on the Id property, even if it's named Id. Is this intended?
I'm using the following code to upload a file using FileStorage.Upload to store a file within an ASP.NET request:
public string UploadFile(string mimeType, string origFileName, string userName, byte[] data)
{
    string fileName = string.Format("{0}.{1}", Guid.NewGuid(), Path.GetExtension(origFileName).Replace(" ", "").Trim(' ', '.'));
    string filePath = string.Format("$/{0}/{1}", userName, fileName);
    using (var db = new LiteDatabase(BlobDBName))
    {
        LiteFileInfo finfo = new LiteFileInfo(filePath);
        finfo.MimeType = mimeType;
        finfo.Filename = origFileName;
        using (MemoryStream ms = new MemoryStream(data))
        {
            ms.Seek(0, SeekOrigin.Begin);
            db.FileStorage.Upload(finfo, ms);
        }
        return fileName;
    }
}
The web page uses a control called "DropZone" to asynchronously upload multiple files at a time by calling into a web endpoint. My endpoint uses the above code to store the file. When I am uploading multiple files (and also retrieving them), I'm getting a number of strange errors.
The most common one is an access denied error against the journal file, but I also get "IOException: The Segment is already unlocked". These are bothersome, but not fatal. The bigger issue is that a number of files will be recorded in the _files collection with a zero length. I've verified the associated chunks exist and are properly recorded, but the file length is never updated. I am using the latest NuGet package (1.0). Any idea what may be causing this?
FYI: My retrieval code looks like this:
public FileInformation GetFile(string userName, string fileName)
{
    using (var db = new LiteDatabase(BlobDBName))
    {
        string filePath = string.Format("$/{0}/{1}", userName, fileName);
        var liteFileInfo = db.FileStorage.FindById(filePath);
        if (liteFileInfo == null)
            return null;

        FileInformation ret = new FileInformation();
        ret.SizeInBytes = liteFileInfo.Length;
        ret.MimeType = liteFileInfo.MimeType;
        ret.Extension = Path.GetExtension(liteFileInfo.Filename).Trim('.');
        ret.OriginalFileName = liteFileInfo.Filename;
        ret.FileName = fileName;
        using (MemoryStream ms = new MemoryStream())
        {
            liteFileInfo.CopyTo(ms);
            ret.data = ms.ToArray();
        }
        return ret;
    }
}
I am using LiteDB 1.0.1 via NuGet, and when doing col.FindAll(), I keep getting the following exception:
Could not load file or assembly 'query_yoxwar, Version=0.0.0.0, Culture=neutral, PublicKeyToken=null' or one of its dependencies. The system cannot find the file specified.
When running the same FindAll() in Linqpad, I get..
[A]Customer cannot be cast to [B]Customer. Type A originates from 'query_yoxwar, Version=0.0.0.0, Culture=neutral, PublicKeyToken=null' ...
Inserting worked just fine.
Any help is appreciated.
Hi Mauricio,
Thanks for your project, first of all.
I have a doubt: why is there no ordering in Find?
IEnumerable<T> Find(Query query, int skip = 0, int limit = int.MaxValue)
For example, I want to take the latest 100 docs, and I have a field "AddedTime".
The normal way is to order by AddedTime descending, then skip 0 and take (limit) 100; that's what I need.
But without ordering, the take makes no sense.
What should I do?
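Until ordered queries exist, one workaround is to order on the client side with LINQ after fetching the candidates; this pulls the matched documents into memory, so it only suits modest collection sizes. The anonymous type here stands in for a mapped POCO with an AddedTime field:

```csharp
using System;
using System.Linq;

// Simulated query result: 500 docs with increasing AddedTime values.
var docs = Enumerable.Range(1, 500)
    .Select(i => new { Id = i, AddedTime = new DateTime(2015, 1, 1).AddMinutes(i) })
    .ToList();

// Client-side "latest 100": order descending by AddedTime, then take 100.
var latest100 = docs
    .OrderByDescending(d => d.AddedTime)
    .Take(100)
    .ToList();

Console.WriteLine($"{latest100.Count} docs, newest id = {latest100[0].Id}");
```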
like its data context driver
It runs correctly on Windows with .NET Framework 3.5, 4.0 and 4.5. Could you consider supporting Mono and Xamarin?