
[NOTE in progress] Simulation Optimization


A brief note on some knowledge points and thoughts about simulation optimization, mainly based on: Handbook of Simulation Optimization, Michael Fu.

Table of Contents

Overview

Discrete Optimization

Three fundamental types of errors:

Optimality Conditions

Different scenarios depending on the solution space size:

Ranking and Selection

Ordinal Optimization (OO)

Globally Convergent Adaptive Random Search

Locally Convergent Adaptive Random Search

Commercial Solvers


Overview

This is the overview of the book, which can also be read as an overview of the field.

  • Simulation optimization (SimuOpt): optimize when the objective function cannot be computed directly but can be simulated, with noise (the focus here is on stochastic simulation environments).

One way to classify the methods: Discrete vs. Continuous.

  • Discrete Optimization
    • Solution space is small -> Ranking & Selection (based on statistics or on simulation budget allocation)
    • Solution space is large but finite -> Ordinal Optimization (no need to estimate every candidate accurately; we only need to know their order, which can be learned with much faster (exponential) convergence)
    • Solution space is countably infinite -> Random Search (globally or locally convergent)
  • Continuous Optimization
    • RSM (Response Surface Methodology), which also has constrained and robust variants
    • Stochastic Approximation (Robbins-Monro, Kiefer-Wolfowitz, and simultaneous perturbation stochastic approximation (SPSA) for high-dimensional problems; a minimal SPSA sketch follows this list)
    • SAA (Sample Average Approximation), with consideration of stochastic constraints
    • Random Search, with a focus on estimation and on the search procedure. Model-based RS is a newer class that maintains an explicit probability distribution over the solution space
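
As a concrete illustration of the stochastic-approximation family mentioned above, here is a minimal SPSA sketch: each iteration needs only two noisy evaluations of the objective, regardless of dimension. The gain constants and the toy objective are illustrative choices of mine, not values from the handbook.

```python
# A minimal SPSA sketch (gain constants and toy objective are illustrative assumptions).
import numpy as np

def spsa_minimize(sim, theta0, iters=1000, a=0.1, c=0.1, alpha=0.602, gamma=0.101, seed=0):
    """Simultaneous perturbation stochastic approximation (SPSA).

    Each iteration uses only two noisy evaluations of `sim`, regardless of the
    dimension of theta, which is why SPSA scales to high-dimensional problems.
    """
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    for k in range(1, iters + 1):
        ak = a / k**alpha                                   # step-size gain
        ck = c / k**gamma                                   # perturbation size
        delta = rng.choice([-1.0, 1.0], size=theta.shape)   # Rademacher perturbation
        g_hat = (sim(theta + ck * delta) - sim(theta - ck * delta)) / (2 * ck) * (1.0 / delta)
        theta = theta - ak * g_hat                          # gradient-descent style update
    return theta

# Toy usage: noisy quadratic, true minimizer at the origin.
if __name__ == "__main__":
    noisy_quadratic = lambda x: float(np.sum(x**2) + np.random.normal(scale=0.1))
    print(spsa_minimize(noisy_quadratic, theta0=np.ones(10)))
```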

Since stochasticity is the keyword, some background knowledge is important for discrete optimization (DO) as well as for continuous optimization (CO):

  • Statistics
    • How to estimate a solution
    • How to know whether solution x is better than y
    • How to know to what extent the search has covered the optimal solution
    • How many replications do we need...
    • Hypothesis testing
  • Stochastic constraints
  • Variance reduction
  • ...

Discrete Optimization

Three fundamental types of errors:

  • The optimal solution is never simulated (about search)
  • The optimum that was simulated is not selected (about estimation)
  • The one selected is not well estimated (about estimation)

Optimality Conditions

  • are needed to 1) ensure the correctness of the algo; 2) define the stopping criteria
  • for unconstrained non-linear optimization, we stop at a stationary point
  • for integer optimization, we check the gap between the lower and upper bounds
  • here for SBO, it's difficult because:
    • the cost of solution g(x) can only be estimated
    • no structural information can be used to prune the solution space
    • complete enumeration of the solution space is often computationally intractable

Different scenarios depending on the solution space size:

  • Small. Fewer than a few hundred candidates. The key is then how to estimate all solutions well and return the best. In practice we analyze the Probability of Selection Correctness (PSC). The algorithm stops once PSC reaches a prescribed level (e.g. 1 - alpha), where x* is the selected best solution.
  • Large.
    • It is impossible to simulate all candidates. The idea is then to find a "good enough" solution, meaning that x* is among the top-t solutions with a certain probability. This is used in ordinal optimization.
    • Alternatively, choose methods with a global convergence guarantee (convergence to the set of global optima) or a local convergence guarantee (convergence to the set of local optima, which depends on the definition of the neighborhood structure). Local optimality can be tested statistically by controlling the type I and type II errors, which is feasible because a neighborhood is usually not large (a minimal sketch of such a test follows this list).
    • Hypothesis testing: if the hypothesis is right, what is the probability of our observation? This is akin to a proof by contradiction, emphasizing rejection rather than acceptance.
    • (Meta)heuristics are often found in commercial solvers. These algorithms work well for difficult deterministic integer programs, and they are somewhat tolerant of sampling variability. However, they typically do not satisfy any optimality conditions for DOvS problems and may be misled by sampling variability.
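
To make the idea of testing local optimality statistically concrete, here is a minimal sketch that compares x* against each neighbor with one-sided Welch tests and a Bonferroni correction. The sim(x, rng) interface and the specific choice of test are illustrative assumptions of mine, not the handbook's particular procedure.

```python
# A minimal sketch of a statistical local-optimality check (an illustration, not
# the handbook's specific test): simulate x* and each neighbor, then test whether
# any neighbor is significantly better, with a Bonferroni correction to control
# the overall type I error.
import numpy as np
from scipy import stats

def looks_locally_optimal(sim, x_star, neighbors, n_reps=30, alpha=0.05, seed=0):
    rng = np.random.default_rng(seed)
    y_star = np.array([sim(x_star, rng) for _ in range(n_reps)])  # sim(x, rng) is a hypothetical interface
    alpha_per_test = alpha / max(len(neighbors), 1)               # Bonferroni correction
    for x in neighbors:
        y = np.array([sim(x, rng) for _ in range(n_reps)])
        # One-sided Welch test of H0: g(x) >= g(x*) against H1: g(x) < g(x*) (minimization).
        t, p_two_sided = stats.ttest_ind(y, y_star, equal_var=False)
        p_one_sided = p_two_sided / 2 if t < 0 else 1 - p_two_sided / 2
        if p_one_sided < alpha_per_test:
            return False   # some neighbor is significantly better than x*
    return True
```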

Ranking and Selection

Two formulations are considered:

  • indifference-zone formulation (IZF)
  • Bayesian formulation (BF)

IZF (Frequentist)

Assume the least-favorable (slippage) configuration, in which every other solution is exactly delta worse than the best one; this is the most difficult case. The objective is to select x1, which is at least delta-better than all the others, with a prescribed probability.

  • Bechhofer's procedure: assuming a known common variance, Bechhofer's procedure decides the number of replications used to estimate each solution; it then suffices to choose the best one based on the sample means (a minimal single-stage sketch in this spirit follows the list of further procedures below).
  • Paulson's procedure: filter progressively. At each iteration, take one observation of each surviving solution, update the sample means, and filter out some bad solutions. This is more efficient than Bechhofer's procedure since a large number of solutions may be filtered out at early stages.
  • Gupta's procedure (subset selection): similar in spirit to the two above, but returns a set of solutions S with the guarantee that the best solution is contained in S with the required probability.
  • Building on the principles of the above three procedures, further procedures include:

  • NSGS: a two-stage procedure. Compute an initial sample mean for each solution, then, according to the variance of the estimates, decide the amount of extra replications to take. Finally select the best.
  • KN: in contrast to NSGS, this is not a two-stage procedure but a fully sequential one, adding replications progressively.
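
To make the single-stage idea concrete, here is a minimal Python sketch in the spirit of Bechhofer's procedure. It uses a conservative Bonferroni constant rather than Bechhofer's exact multivariate-normal constant, and the helper sim_all(i, n, rng) is a hypothetical interface for drawing n noisy observations of solution i.

```python
# A minimal sketch of a Bechhofer-style single-stage selection procedure with a
# known common variance. The Bonferroni constant below is a conservative stand-in
# for Bechhofer's exact constant, so the sample size is slightly larger than needed.
import math
import numpy as np
from scipy.stats import norm

def single_stage_select(sim_all, k, sigma, delta, alpha=0.05, seed=0):
    """Return (index of the solution with the smallest sample mean, replications used).

    sim_all(i, n, rng) is assumed (hypothetically) to return n i.i.d. noisy
    observations of solution i; sigma is the known common standard deviation;
    delta is the indifference-zone parameter.
    """
    rng = np.random.default_rng(seed)
    h = norm.ppf(1 - alpha / (k - 1))              # Bonferroni-conservative constant
    n = math.ceil(2 * (h * sigma / delta) ** 2)    # replications per solution
    means = [np.mean(sim_all(i, n, rng)) for i in range(k)]
    return int(np.argmin(means)), n
```
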
BF (Bayesian)

Used when prior information is available. Note that it does not provide a PSC guarantee.

The Bayesian view helps to choose the next solution to simulate, based on the prior, the previous sample results, and the remaining simulation budget. This can be formulated as an MDP and possibly solved by ADP/RL.

  • Generic Bayes procedure: basically an RL-style loop: simulate (state) -> choose the next solution to simulate (action) -> repeat.
  • Since the optimal policy is hard to compute, several heuristics are proposed (an OCBA allocation sketch follows this list):
    • OCBA (Optimal Computing Budget Allocation)
    • EVI (Expected Value of Information)
    • KG (Knowledge Gradient)
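
As a concrete illustration of OCBA's idea, here is a minimal sketch of the standard OCBA asymptotic allocation rule: given current sample means and standard deviations, it splits an additional budget so that the approximate probability of correct selection is maximized. The function name and interface are mine, and it assumes no exact ties with the current best design.

```python
# A minimal sketch of the OCBA asymptotic allocation rule.
import numpy as np

def ocba_allocation(means, stds, total_budget):
    means = np.asarray(means, dtype=float)
    stds = np.asarray(stds, dtype=float)
    b = int(np.argmin(means))                 # current best design (minimization)
    delta = means - means[b]                  # optimality gaps; assumes no exact ties with the best
    ratio = np.zeros_like(means)
    non_best = [i for i in range(len(means)) if i != b]
    ref = non_best[0]                         # reference non-best design gets ratio 1 implicitly
    for i in non_best:
        # N_i / N_ref = (sigma_i / delta_i)^2 / (sigma_ref / delta_ref)^2
        ratio[i] = (stds[i] / delta[i]) ** 2 / (stds[ref] / delta[ref]) ** 2
    # N_b = sigma_b * sqrt(sum_i (N_i / sigma_i)^2) over the non-best designs
    ratio[b] = stds[b] * np.sqrt(np.sum((ratio[non_best] / stds[non_best]) ** 2))
    return total_budget * ratio / ratio.sum() # budget share per design
```
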
Conclusion

Branke et al. compared R&S procedures on thousands of problem configurations and found that no procedure is dominant in all situations. BF procedures are often more efficient in terms of the number of samples, but they do not provide the correct-selection guarantee that the frequentist (IZF) procedures do.

Ordinal Optimization (OO)

When the solution space is large, OO proposes "soft optimization": select a subset S from the solution space and limit the analysis to S. We are interested in the probability that |S ∩ T| >= k, where T is the set of the top-t solutions in the whole space; k is called the alignment level, and this probability is the alignment probability.

Two basic ideas behind OO:

  • Estimating the order between solutions is much easier than estimating objective values.
  • Accepting "good enough" solutions leads to an exponential reduction in computational burden.

OO is more an analysis framework than a new algorithm. The procedure is as follows (a blind-picking example follows this list):

  • First determine the target AP (alignment probability).
  • That determines the required cardinality of the subset S.
  • Then run R&S on S, and you get the guarantee that the returned solution is among the top t.
  • In practice I don't find this so interesting, since it essentially just tells you that the larger S is, the better.
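
To see what the alignment probability buys you, here is a small example for the simplest "blind picking" case, where the subset S is drawn uniformly at random and |S ∩ T| therefore follows a hypergeometric distribution. The function name is mine.

```python
# A minimal sketch of the alignment-probability calculation under blind picking.
from scipy.stats import hypergeom

def blind_picking_ap(n_total, t_top, s_size, k_align):
    """P(|S ∩ T| >= k_align) when S of size s_size is drawn uniformly at random
    from n_total solutions and T is the set of the t_top best solutions."""
    return float(hypergeom.sf(k_align - 1, n_total, t_top, s_size))

# Example: 10000 solutions, top-100 set, random subset of 500, at least 1 hit.
print(blind_picking_ap(10000, 100, 500, 1))   # ~0.994
```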

Globally Convergent Adaptive Random Search

Designed for large but finite solution spaces. The guarantee is that the estimated best solution converges with probability 1 to the set of global optima as the simulation effort goes to infinity.

Generic GCARS:

  • Initialization
  • Sampling: sample a set of candidate solutions from the solution space according to the current sampling distribution
  • Estimation: run simulation replications for the sampled solutions
  • Iteration: update the estimate V(x) for every visited solution x, then return to the sampling step

Several algorithms are described (a sketch of the stochastic-ruler acceptance step follows this list):

  • Stochastic Ruler Algorithm: accept a candidate solution only if its noisy observations beat independently drawn uniform "rulers" u ~ U(lb, ub)
  • Stochastic Branch and Bound: at each step choose the partition of the solution space with the minimum lower-bound estimate, then partition it finer and finer
  • Nested Partition: an enhancement of SBB that needs less information to be memorized
  • R-BEESE (Balanced Explorative and Exploitative Search with Estimation). On each iteration:
    • with probability q, refine the current best x* with more replications;
    • otherwise, with probability p, sample from the global sampling distribution Global(theta);
    • otherwise, sample from the local sampling distribution Local(theta).
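
To make the stochastic-ruler acceptance step concrete, here is a minimal sketch. The neighborhood sampler, the test-count schedule, and the function names are illustrative assumptions, not the exact specification of the original algorithm.

```python
# A minimal sketch of a stochastic-ruler style search (an illustration of the idea):
# a candidate is accepted only if every noisy observation beats an independent
# uniform "ruler" draw from (lb, ub).
import numpy as np

def stochastic_ruler_accept(sim, candidate, lb, ub, n_tests, rng):
    for _ in range(n_tests):
        y = sim(candidate, rng)        # one noisy observation of the objective
        ruler = rng.uniform(lb, ub)    # independent ruler draw
        if y > ruler:                  # fails the test -> reject (minimization)
            return False
    return True

def stochastic_ruler_search(sim, neighbors, x0, lb, ub, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    x = x0
    for k in range(1, iters + 1):
        z = neighbors(x, rng)          # sample a candidate from the neighborhood of x
        n_tests = 1 + k // 50          # slowly increasing test count (illustrative schedule)
        if stochastic_ruler_accept(sim, z, lb, ub, n_tests, rng):
            x = z
    return x
```
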
Locally Convergent Adaptive Random Search

Similar to GCARS, but with a statistical procedure to test the local optimality of x*.

COMPASS (Convergent Optimization via Most Promising Area Stochastic Search)

  • Initialization: sample a set of solutions, simulate them, and retain the best as x*.
  • Move to the next most-promising area, defined as the set of feasible solutions that are at least as close to x* as to any other visited solution; in other words, always focus on the closest neighbors of x*. Redundant defining constraints can be removed by solving LPs (called constraint pruning). A membership/sampling sketch follows this list.
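
Here is a minimal sketch of the most-promising-area idea, assuming Euclidean distance and an integer-ordered feasible region sampled by rejection; the constraint-pruning LPs mentioned above are omitted, and the function names are mine.

```python
# A minimal sketch of COMPASS's most-promising-area idea: a point belongs to the
# area if it is at least as close to the current best x* as to every other
# visited solution; candidates are drawn by rejection sampling.
import numpy as np

def in_most_promising_area(x, x_star, visited):
    x, x_star = np.asarray(x, float), np.asarray(x_star, float)
    d_star = np.linalg.norm(x - x_star)
    return all(d_star <= np.linalg.norm(x - np.asarray(v, float)) for v in visited)

def sample_most_promising_area(x_star, visited, sample_box, rng, max_tries=1000):
    """Rejection sampling: draw integer points uniformly from sample_box
    (a list of (low, high) ranges per dimension) until one lies in the area."""
    for _ in range(max_tries):
        x = np.array([rng.integers(lo, hi + 1) for lo, hi in sample_box])
        if in_most_promising_area(x, x_star, visited):
            return x
    return None
```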

AHA (Adaptive Hyperbox Algorithm)

Like COMPASS, but defines the most-promising area as a hyperbox around x*: a product of d intervals (where d is the dimension of x), each extending from x* to the nearest visited solution on either side in that coordinate.
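
A minimal sketch of how such a hyperbox can be built from the visited solutions; the function name and the (low, high) bound representation are my own conventions.

```python
# A minimal sketch of the AHA hyperbox construction: for each coordinate, the box
# around x* extends to the nearest visited solution on either side (or to the
# variable bounds if none), so no other visited point lies strictly inside it.
import numpy as np

def build_hyperbox(x_star, visited, var_bounds):
    """var_bounds is a list of (low, high) pairs, one per dimension."""
    x_star = np.asarray(x_star, float)
    box = []
    for d, (lo, hi) in enumerate(var_bounds):
        below = [v[d] for v in visited if v[d] < x_star[d]]
        above = [v[d] for v in visited if v[d] > x_star[d]]
        box.append((max(below) if below else lo, min(above) if above else hi))
    return box   # list of per-dimension (low, high) intervals around x_star
```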

Commercial Solvers

Most simulation modeling software includes an SBO tool, but most of these are based on R&S or on meta-heuristics such as SA. Meta-heuristics have been observed to be effective on difficult deterministic optimization problems, but they usually provide no performance guarantees. Some advice:

  • Do preliminary tests to control the sampling variability.
  • Re-run the solver several times (multi-start with different random seeds).
  • Estimate the final set of candidate solutions carefully to be sure to select the best one.
Conclusion

Most of the algorithms mentioned above are black-box algorithms that do not exploit problem structure. Problem structure could be exploited, for instance, when defining the neighborhood in locally convergent random search.

