
Maximum Database size tested


Maximum Database size tested

Florbela Tique Aires Viegas

Hi,
I'd like to run a test with a large amount of data, on the order of terabytes.
What is the maximum database size that has been tested with this software?
Is there a way to specify multiple "db.dat" files, like Oracle's datafiles, or am I
constrained by the maximum file size of the file system?
Thanks,

Florbela



Re: Maximum Database size tested

Zelaine Fong
We've done testing with TPC-H at the 10 GB scale.  Currently, there's no way to specify
multiple db.dat files, so yes, you're constrained by the maximum file size your file system allows.

-- Zelaine
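
Since all of the data lives in that single db.dat file, it can help to check how much
headroom the file has on its volume before kicking off a terabyte-scale load. A minimal
sketch in Python, assuming a typical catalog location (the path below is an assumption,
not a canonical one):

import os
import shutil

DB_DAT = "/opt/luciddb/catalog/db.dat"   # assumed location; adjust to your install

def report_headroom(path):
    """Print the current db.dat size and the free space left on its volume."""
    current = os.path.getsize(path)
    total, used, free = shutil.disk_usage(os.path.dirname(path))
    print("db.dat size : %8.1f GiB" % (current / 2.0**30))
    print("free space  : %8.1f GiB" % (free / 2.0**30))
    print("volume size : %8.1f GiB" % (total / 2.0**30))
    # With a single data file, the volume's free space (and the file system's
    # maximum file size) is the effective ceiling for database growth.

if __name__ == "__main__":
    report_headroom(DB_DAT)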


Re: Maximum Database size tested

John Sichi
Some other users have been successfully testing out larger sizes:

http://sourceforge.net/mailarchive/forum.php?thread_name=4697E5BD.3070606%40refractions.net&forum_name=luciddb-users

A good way to approach this is to test incrementally instead of going
straight to terabyte scale.  For example, first load a very small
dataset with some sample data just to make sure that the schema is
correct, nothing strange is happening with datatype conversions, etc.
Then load 1GB, then 10GB, then 100GB, then the full scale.

After each increment, run data validation queries and make sure that the
results match your expectations for the loaded data.
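
As a rough illustration of such a per-increment check (not LucidDB-specific tooling:
the jaydebeapi package, JDBC driver class name, connection URL, credentials, jar path,
table names, and expected counts below are all assumptions for the sake of the example):

import jaydebeapi

DRIVER = "org.luciddb.jdbc.LucidDbClientDriver"   # assumed driver class name
URL = "jdbc:luciddb:http://localhost:8034"        # assumed connection URL
JAR = "/opt/luciddb/plugin/LucidDbClient.jar"     # assumed path to the driver jar

EXPECTED_COUNTS = {                               # made-up tables and row counts
    "SALES.FACT_ORDERS": 1000000,
    "SALES.DIM_CUSTOMER": 150000,
}

def validate(expected):
    """Run COUNT(*) checks and report mismatches for the current increment."""
    conn = jaydebeapi.connect(DRIVER, URL, ["sa", ""], JAR)
    ok = True
    try:
        cur = conn.cursor()
        for table, want in expected.items():
            cur.execute("SELECT COUNT(*) FROM " + table)
            (got,) = cur.fetchone()
            if got != want:
                ok = False
            print("%-25s expected=%12d actual=%12d %s"
                  % (table, want, got, "OK" if got == want else "MISMATCH"))
        cur.close()
    finally:
        conn.close()
    return ok

if __name__ == "__main__":
    if not validate(EXPECTED_COUNTS):
        raise SystemExit("validation failed for this increment")

Comparing a handful of COUNT(*) and similar aggregate checks after each load is usually
enough to catch schema or datatype-conversion problems before they are buried under
terabytes of data.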

Also note that LucidDB's data compression algorithms mean that the
storage size required may be significantly smaller than what you'd need
with a traditional row-store DBMS; the incremental testing should allow
you to estimate the ratio.
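
A back-of-the-envelope way to turn one increment into a full-scale estimate (the figures
in the example call are made up):

def estimate_full_scale(raw_loaded_gb, db_dat_growth_gb, raw_full_gb):
    """Extrapolate on-disk size for the full load from a single increment."""
    ratio = db_dat_growth_gb / raw_loaded_gb       # stored bytes per raw byte
    projected_gb = raw_full_gb * ratio
    return ratio, projected_gb

# Example: 100 GB of raw files grew db.dat by 35 GB; the full data set is 2 TB raw.
ratio, projected = estimate_full_scale(100.0, 35.0, 2000.0)
print("compression ratio ~%.2fx raw, projected db.dat ~%.0f GB" % (ratio, projected))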

Let us know how it goes.

JVS
