#1
Tracking into global table - Experimenter
Hi all!
I have a question concerning the Experimenter tool and recording statistics. I am dealing with a two-stage, demand-driven batch production system that runs for 100 periods (period length: 3600). I want to look at several scenarios in which I vary the batch sizes on the different echelons. I have a rather complicated set of labels and variables to handle this batch production process, and the Experimenter is a very helpful tool for varying these input parameters. My questions:

1. My system needs to start non-empty, so the object "SourceInit" provides the queues with initial items. I want it to do this each time a new scenario (new batch combination) starts. Can I handle this by defining the initial amount of items (Arrival Sequence - Quantity x) as a decision variable in the Experimenter? I tried this in a smaller model but did not see any difference, so I cannot say whether it worked or not.

2. My goal is to track the flow time of each item. I managed to write this data into a global table. This works nicely for one scenario, but how can I track the flow times of a new SERIES of items (a new scenario!) successively in the same global table? Each item that passes through the system has a unique id (item label "Stueck"). This would make it easy to distinguish the scenarios from one another, as they are consecutively ranked in the global table.

3. Scenario 2, which I have defined, starts as it should but does not execute my UserEvent (demand query); it stops my demand process. Maybe it has something to do with initial items that do not enter the system correctly? I know the very last question will be rather difficult to solve if you are not deep enough into the model, but maybe you have a clue...

Thank you so much for your kind advice! All the best!

Simon
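For question 2, appending one row per finished item from the Sink's OnEntry trigger is one way to collect the flow times. The following is only a minimal sketch using the classic global table commands, not code from the model: it assumes the global table is named "EventTime", that items carry the unique id label "Stueck", and that a hypothetical label "StartTime" is stamped on each item when it enters the system so the flow time can be computed as current time minus start time.

treenode current = ownerobject(c);   // the Sink
treenode item = parnode(1);          // the item that just entered

// grow the table by one row and record this item
int row = gettablerows("EventTime") + 1;
settablesize("EventTime", row, 3);
settablenum("EventTime", row, 1, getlabelnum(item, "Stueck"));              // unique item id
settablenum("EventTime", row, 2, time() - getlabelnum(item, "StartTime"));  // flow time ("StartTime" is an assumed label)
// settablenum("EventTime", row, 3, ...);                                   // optional: a scenario identifier

Whether the table keeps its rows across the reset between scenarios depends on the model setup; exporting it once per scenario, as discussed later in this thread, avoids relying on that.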
#2
I was just looking for threads on a similar topic. For your second question, I hope this thread will help you:
http://www.flexsim.com/community/for...read.php?t=747 . For the remaining questions, perhaps another expert can help.
#3
Dear Simon,
Please see the attached model for your first question. Regards, Arun KR
#4
Thank you very much for your comments! I think I found a way for questions one and two. I will give you an update on this issue asap.

I found the mistake and hopefully the answer to question 3: there is a problem with the backorders (backlogs). If there are any, they should be accumulated, not deleted. In each period where the demand cannot be fully fulfilled, the label "DemandDay" in the Sink does not decrease to zero. That is fine, since I want to account for the possible backlogs. But once a new period starts, a new message from the UserEvent SETS (setlabelnum...) the label to the actual demand and does not account for the backlog from the previous period (it simply disappears).

Now the simple question: how can I UPDATE a label instead of SETTING it? I tried it with "inc(label...)" but it doesn't work. I am quite sure there must be a very simple answer to this point, and therefore an answer for backlogs, which is a highly important topic too. Attached you find a much simpler and smaller model with the same logic. Once I have solved my issues I will publish them here - I really appreciate your help! All the best, Simon

Last edited by Simon Jutz; 04-22-2014 at 07:00 AM.
#5
Hello Simon,
as far as I have tested, the way of collecting results from different scenarios and replications has changed from older versions such as FlexSim 5 to FlexSim 7. In FlexSim 5 you could write your results directly into a global table. In FlexSim 7 the workflow is roughly:

1. Define performance measures.
2. Show their results as raw data; this appears to be a table.
3. Explore that table as a tree.
4. Explore its data node as a table again.
5. Display this table in a dashboard, which is quite easy with the red pin in the upper menu bar.
6. Export the shown dashboard table as a csv file to disk.

Alternatively, already at step 2 you can use the Excel importer/exporter tool to establish a permanent model link to the results of the performance measures under the model tree.

The advanced triggers of the Experimenter also behave differently from previous FlexSim versions: as far as I can tell, the End of Replication trigger can no longer read statistical results of the replication such as input, output, or time; the results seem to have been reset already by the time the trigger fires.

Jörg
#6
Hello Jörg,
thanks for the comments on the difference between FlexSim 7 and older versions. Personally I am still working with FlexSim 4 and hope to change to version 7 soon.

For now I developed code in Tools \ Experimenter \ End of Scenario that gives me a separate csv file for each scenario - it works perfectly! Here is the code:

/** End of Scenario: export the global table "EventTime" to a csv file named per scenario */
double replication = parval(1);
double scenario = parval(2);
treenode EventTime = node("/Tools/GlobalTables/EventTime", model());
string filename = "EventTime";
filename = concat(filename, "_S", numtostring(scenario, 0, 0), ".csv");  // e.g. EventTime_S3.csv
string directory = modeldir();
if (stringlen(directory) < 3)
    directory = documentsdir();   // fall back to the documents folder if no model directory is available
filename = concat(directory, filename);
exporttable(node(">variables/data", EventTime), filename, 1, 1);  // export the table's raw data node

I am still stuck with this simple question: there is a problem with the backorders (backlogs) in my model. If there are any, they should be accumulated, not deleted. In each period where the demand cannot be fully fulfilled, the label "DemandDay" in the Sink does not decrease to zero. That is fine, since I want to account for the possible backlogs. But once a new period starts, a new message from the UserEvent SETS (setlabelnum...) the label to the actual demand and does not account for the backlog from the previous period (it simply disappears).

Now the simple question: how can I UPDATE a label instead of SETTING it? I tried it with "inc(label...)" but it doesn't work. I am quite sure there must be a very simple answer to this point, and therefore an answer for backlogs, which is a highly important topic too. Maybe you or some other expert can help me on that point. I searched the forum for this but couldn't find any hints.

Thank you so much! Simon
#7
Hello Simon,
you can read the existing label value into a local variable before you set the label to the sent message parameter. If you add the sent parameter to that local variable, you can then set the label again. Incrementing the label directly should work as well; the inc command works on any number node, including a number label:

inc(label(current, "your label name"), incrementvalue);

Jörg
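Applied to the "DemandDay" label on the Sink, a minimal sketch of the OnMessage trigger could look as follows. It assumes, as described earlier in the thread, that the UserEvent sends the new period demand in a message; the parameter position (msgparam(1)) is an assumption:

treenode current = ownerobject(c);   // the Sink
double newdemand = msgparam(1);      // period demand sent by the UserEvent (assumed to be parameter 1)

// keep the unfulfilled demand (backlog) from the previous period and add the new demand
double backlog = getlabelnum(current, "DemandDay");
setlabelnum(current, "DemandDay", backlog + newdemand);

// equivalent one-liner, incrementing the label node directly:
// inc(label(current, "DemandDay"), newdemand);

Either form accumulates the backlog instead of overwriting it; the only difference from setting the label outright is that the old value is read back before the label is written.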
#8
Hi everybody!
Again, thank you very much for your comments! I managed to solve all the problems I outlined. Now I am stuck with a rather general question which is also related to the Experimenter.

As I already mentioned, I am dealing with a batch processing facility that operates on two sequential manufacturing stages. I investigated 195 scenarios, simulating one batch size combination per scenario and recording the flow time per item. The output FlexSim gives me is first written into a global table, and this table is converted into a csv file after each scenario.

I was quite surprised how long it took to simulate these 195 scenarios (almost 72 h spread over several PCs). The input parameters were all deterministic (no stochastics considered!) and what I basically did is record flow times - so there is no complicated optimization procedure in the background, etc. My question: is it normal to see such run times, even if I simply record the flow times of items passing through a system? Or did I program a very cumbersome way of making the model do what I want?

An interesting finding: the more scenarios I gave one PC, the more time the individual scenarios required to finish (an almost exponential relationship). Is that normal, given that there are no stochastics and no complicated scenarios? Perhaps the logic of the Experimenter could be programmed in a User Command, but wouldn't this inflate the runtime even more?

Thanks for any comments on this one! I know it's not easy to answer properly! All the best, Simon
#9
Simon,
Most of what you are experiencing is due to the old version you are using. In V7, the Experimenter runs as many replications simultaneously as there are CPU cores available. I do not remember exactly anymore why the Experimenter turned out to be slow in V4... However, stochastic behavior does not really have an impact on simulation run length.

Are you writing to Excel while running your simulation? That would be a real performance killer...

The good news is, your 7.1 license including OptQuest was delivered yesterday. Good luck!

Ralf
FlexSim
#10
Hi Ralf!
Thank you for your answer! Yes, I'm really glad to have V7 now and hope to convert my model to the new version soon.

Yes, in my current model I want the Experimenter to write the output of each scenario FROM a global table TO an Excel file. The reason for doing this is that it is more convenient afterwards to distinguish between the different scenarios. Does the runtime also improve in V7 if I use the same procedure, i.e. writing csv files during the entire simulation run?

Thanks! Simon
#11
Simon,
yes, this is due to the communication technology of Windows. It is good practice to collect all results internally first and then, at the end of the simulation run, transfer all the data to Excel in one operation. Hope that helps.

Ralf
FlexSim