Estimating AS2K5 Storage & Memory usage

MS SQL SERVER

• 1. Aggregate Function in Cube Editor
I am trying to calculate the Standard Deviation for 200,000 Student scores in a fact table. When I create a measure using Cube Editor called 'MathScore' for the 200,000 Students, I cannot use the Stdev aggregate function. Nor can I create a student dimension for the 200,000 student members. I get the error that you cannot create a dimension with more than 64,000 members. If I could do that, I would be able to use MDX and call the Stdev function in my query. I don't want a dimension larger than 1000 members, but I have to get the standard deviation of 200,000 scores. How can I do this in a cube? thank you, tim
• 2. Lookup Table to Dimension
I have a question about creating a dimension from a lookup table that holds the lookup values for several fields in my fact table. FACT TABLE foreignkeyvalue1 (can hold values 1,2, or 3) foreignkeyvalue2 (can hold values 1,2, or 3) foreignkeyvalue3 (can hold values 1,2, or 3) LOOKUP TABLE 1 = Low 2 = Medium 3 = High Assuming that I want to keep the fact table flat, do I have to create 3 separate lookup tables to create 3 separate dimensions in a cube for the three foreign keys, or is there a way to create 3 separate dimensions for the three foreign keys off of the same lookup table? When I create a dimension off of the LOOKUP TABLE for foreignkeyvalue1, I am unable to get it to create another independent dimension off of the same LOOKUP TABLE for foreignkeyvalue2. ANYONE? tx, tim
• 3. A challenge design question
Fast food chain store (think the likes of Burger King or McDonald's). Items are sold separately or combined. We need to know, for each item sold, the ratios at which other items are sold in the same transaction. For example, one day during 5-6PM, 200 Cokes were sold in 160 transactions (orders); 75 of them were part of combos. Those 160 transactions also sold 140 cheeseburgers and 130 French fries (85 of them part of combos). That is, in that time period, 200 Cokes were sold in transactions that also sold 85 French fries in combos, 45 French fries not in combos, 140 cheeseburgers, etc. The data warehouse is to provide information like: at 8-9AM, 234 Fries were sold in transactions that also sold 48 Cokes and 93 Coffees; at 7-8PM, 435 Fries were sold in transactions that also had 238 Cokes and 49 Diet Cokes, and so on. Business calendar, store structure, and order size (how much per order) are other dimensions. It's like a crosstab query, plus drill up/down capabilities with date/time, store, etc. There are hundreds of items in tens of categories. So, what's the best way to design this? A fact table like: OrderDateTime DATETIME, StoreID INT, OrderID INT, ProductItemID INT, PartOfCombo BIT, ItemAmount INT. Then how do I build cubes and queries to get the results? Help please! Thanks a lot! DNG
• 4. Cube corrupted?
I am using a VBScript to clone the partitions in a cube, and I found that in the cube I cannot use "for each m in dsoNewPart.Measures" — it just returns nothing (strange) and exits the loop. The cube has 4 measures. However, I can use "for J = 1 to 4" to access the Measures. Should it start from 0? Or do I have a corrupted measure at subscript 0?
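On question 1, a common workaround (an assumption here, not something stated in the thread) is to skip the per-student dimension entirely: store three additive measures — the Sum of the score, the Sum of the squared score, and a row Count — and derive the standard deviation in a calculated member, since all three roll up correctly at any level. The algebra, sketched in Python:

```python
import math

def stdev_from_aggregates(total, total_sq, n):
    """Population standard deviation recovered from additive aggregates.

    Only SUM(x), SUM(x*x) and COUNT(x) are needed, and all three
    aggregate additively in a cube -- no 200,000-member dimension.
    """
    mean = total / n
    variance = total_sq / n - mean * mean
    return math.sqrt(max(variance, 0.0))  # clamp tiny negative rounding error

# Toy check against five scores
scores = [70, 80, 90, 60, 100]
agg = stdev_from_aggregates(sum(scores),
                            sum(s * s for s in scores),
                            len(scores))
```

In the cube this becomes a calculated member along the lines of `SQRT([Measures].[SumSqScore] / [Measures].[ScoreCount] - ([Measures].[SumScore] / [Measures].[ScoreCount])^2)` — the measure names here are hypothetical.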
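On question 3, this is a classic market-basket problem. One design (a sketch under assumptions, not the thread's answer) is to build a second, pair-grained fact table by self-joining the order-line fact on OrderID, so each row states "anchor item A and companion item B appeared in the same order". The counting logic in Python, with toy data:

```python
from collections import Counter
from itertools import permutations

# (order_id, item) order lines -- toy data, hypothetical item names
lines = [
    (1, "Coke"), (1, "Fries"),
    (2, "Coke"), (2, "Burger"), (2, "Fries"),
    (3, "Coke"),
]

# Group items by order, then count each ordered (anchor, companion) pair.
orders = {}
for order_id, item in lines:
    orders.setdefault(order_id, []).append(item)

co_occurrence = Counter()
for items in orders.values():
    for anchor, companion in permutations(set(items), 2):
        co_occurrence[(anchor, companion)] += 1

# co_occurrence[("Coke", "Fries")] == 2: Fries appeared in 2 of the
# 3 transactions that contained a Coke.
```

The relational equivalent is a self-join of the fact table on OrderID with `a.ProductItemID <> b.ProductItemID`, grouped by the pair (plus date/time, store, and PartOfCombo); the resulting pair table then feeds a cube with separate "anchor item" and "companion item" dimensions.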

Estimating AS2K5 Storage & Memory usage

```Greetings all,

Is anyone aware of a documented method to estimate storage
requirements and memory usage for AS2K5? With AS2K, these used to be
roughly equivalent, and there was a formula for determining cube size.

I've searched for something equivalent, but haven't found anything
yet.

This is to answer the questions:
1) How much disk do you need?
2) How can we tell if the whole database will fit in the RAM on this
server?

I realize that AS2K5 doesn't need to load everything into RAM now, but
there are some circumstances where the possibility of doing so is still
desirable.

Thanks and regards,

Tom
```

```For an existing database, is there an automated method of calculating
the storage savings of using the vardecimal storage format?

```
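SQL Server 2005 SP2 ships a stored procedure for exactly this question — `sys.sp_estimated_rowsize_reduction_for_vardecimal`, which samples the table and reports the estimated average row-size reduction. For a rough by-hand figure, the fixed side of the comparison is straightforward, since a fixed-length `DECIMAL(p, s)` always costs the same number of bytes for a given precision. A small Python sketch:

```python
def fixed_decimal_bytes(precision):
    """Bytes used by a fixed-length DECIMAL(p, s) column in SQL Server."""
    if not 1 <= precision <= 38:
        raise ValueError("precision must be between 1 and 38")
    for max_p, size in ((9, 5), (19, 9), (28, 13), (38, 17)):
        if precision <= max_p:
            return size

# A DECIMAL(19, 4) column costs 9 bytes per row whether the stored value
# is 0 or 999,999,999,999,999.9999; vardecimal stores small values in
# fewer bytes, which is where the savings come from.
```

The actual saving depends on the distribution of stored values, which is why the stored procedure samples real data rather than applying a formula.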

```Hi, I've got a database that allows users to store snapshots of data from
their devices. I need to be able to limit the amount of space they use in the
database, so that I can warn them and then delete data if they go over this
limit.

Is there a simple way in SQL to determine this based on the number of rows
used in tables? The problem starts when trying to determine how much space
is used for varchars etc.

I know SQL stores data in pages, but it would be nice to be able to estimate
how much space a certain user has taken up.

Any help would be greatly appreciated

Jonny

```
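For a quick per-table answer, `sp_spaceused 'tablename'` reports the actual reserved and data space, and `AVG(DATALENGTH(col))` gives the real average stored length of a varchar column. To estimate from row counts instead, the back-of-envelope below packs rows into 8 KB pages (the per-row overhead constant is an assumption — real row headers and null bitmaps vary):

```python
import math

PAGE_DATA_BYTES = 8096      # usable bytes per 8 KB data page
ROW_OVERHEAD = 10           # rough per-row header + null bitmap (assumption)

def estimate_table_bytes(row_count, fixed_bytes, avg_var_bytes, var_col_count):
    """Very rough size estimate for one table.

    fixed_bytes    -- total width of fixed-length columns (int=4, datetime=8, ...)
    avg_var_bytes  -- average stored length of all varchar data per row
    var_col_count  -- number of variable-length columns (2-byte offset each)
    """
    row_bytes = ROW_OVERHEAD + fixed_bytes + avg_var_bytes + 2 * var_col_count
    rows_per_page = PAGE_DATA_BYTES // row_bytes
    pages = math.ceil(row_count / rows_per_page)
    return pages * 8192

# Hypothetical: 100,000 snapshot rows per user, 20 bytes of fixed columns,
# three varchars averaging 60 stored bytes in total
size = estimate_table_bytes(100_000, 20, 60, 3)
```

Multiplying the per-row figure by each user's row count gives the per-user warning threshold; indexes add more on top, so treat the result as a floor, not a ceiling.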

```Hi,

I'm running SQL Server 2000 on an NT Workstation with 1GB
of memory.  I've configured the maximum memory of the SQL
Server to be 512MB, but when I check Task Manager it shows
the Memory Usage of sqlservr.exe to be more than 512MB.
(Currently at 560MB). Is it normal that sqlservr.exe will
use more than the maximum allowed?

Alice

SQL Server Version I'm running:
Microsoft SQL Server  2000 - 8.00.534 (Intel X86)
Nov 19 2001 13:23:50
Personal Edition on Windows NT 4.0 (Build 1381:
Service Pack 6)

```

```Is there any way to estimate how much space will be needed by tempdb?
I have a user executing a simple query similar to:

select  orders.customerid,  sum(quantity*unitprice) as amount
from orders inner join [order details]
on orders.orderid=[order details].orderid
group by orders.customerid

In other words, two thin (not too many columns) tables, equi-join,
summarizing and a group by.  The only trouble is her two tables have
28000 rows and nearly 800 million rows.  Her query dies after a few
hours when tempdb autogrows and runs out of disk space at 17 gig.  I
know tempdb is used for work tables, joins, sorts, group bys, etc.
But knowing the columns sizes and number of rows, is there a way to
estimate how much tempdb will be needed?  She's basically the only
user on the system.

By the way, this is SQL Server 2000.
Thanks,
Scott
```
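A back-of-envelope bound (the row widths are assumptions, not figures from the thread): the sort or hash work tables for a query like this hold roughly one entry per input row, and a spilled merge sort can transiently need about twice the input size. With ~800 million rows that gets large fast:

```python
def estimate_sort_spill_bytes(row_count, sort_row_bytes):
    """Rough worst case: every input row spills to a tempdb work table,
    and a merge sort can transiently need about twice the input size."""
    return row_count * sort_row_bytes * 2

# Hypothetical: 800M order-detail rows, ~16 bytes per sort row
# (orderid, customerid, quantity*unitprice product)
need = estimate_sort_spill_bytes(800_000_000, 16)
print(f"~{need / 2**30:.1f} GiB of tempdb")
```

That lands well above the 17 GB that was available, consistent with the failure described. Since there are only 28,000 customers, the final aggregate itself is tiny; a plan that hash-aggregates before any sort would need far less, so checking the query plan matters as much as the arithmetic.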

```Two instances of SQL 2005, one instance of SQL 2000.
We have set max memory to 500MB on the SQL 2005 instances and 512MB on the SQL 2000.
The SQL 2000 stays between 480MB and 500MB, but memory keeps creeping up
daily past the 500MB max setting on the SQL 2005 instances. Currently the
memory usage is 575MB and 573MB.
```