Programming problems in mathematical modeling

They can be classified in several ways: by whether the model is linear, into linear programming and nonlinear programming; by whether the process is divided into stages, into dynamic programming and static (non-dynamic) programming; and by the number of objective functions, into single-objective programming and multi-objective programming.

Linear and nonlinear programming are the more familiar types, so let me say a bit about the others.

1. Dynamic programming is an important branch of operations research and an effective quantitative method for solving multi-stage decision problems. It was founded by the American scholar R. Bellman and his collaborators. In 1951, Bellman first put forward the principle of optimality for solving multi-stage decision problems and gave solutions to many practical problems. In 1957, he published the book "Dynamic Programming", which marked the birth of this important branch of operations research.

In the more than 50 years since its creation, dynamic programming has been widely applied in engineering, business management, industrial and agricultural production, military affairs, and other fields, with remarkable results. In management it can be used for resource allocation problems, shortest path problems, inventory problems, knapsack problems, equipment replacement problems, optimal control problems, and so on. Dynamic programming is therefore an indispensable tool for scientific decision-making in modern management.

The advantage of dynamic programming is that it transforms a multidimensional decision problem into a sequence of one-dimensional optimization problems, which are then solved one by one. Many extremum methods cannot do this, and in this respect dynamic programming compares favorably with most existing optimization methods; its ability to find a global maximum or minimum is also a strength. It should be noted that dynamic programming is an approach to solving optimization problems, a way of thinking, rather than a single new algorithm. We have already learned to solve linear programming problems with the simplex method: every problem that fits the unified mathematical model of linear programming can be solved by it. For dynamic programming there is no comparable unified algorithm. When solving an optimization problem by dynamic programming, one must therefore analyze the specific problem, use the principle of optimality and the methods of dynamic programming to build a corresponding mathematical model, and then solve that model. Because of this, learning the basic principles and methods of dynamic programming also demands a good deal of imagination; only then can one build a good model and find the optimal solution.
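To make the stage-by-stage idea concrete, here is a minimal Python sketch of the 0/1 knapsack problem mentioned above, solved by dynamic programming. Each item is treated as one stage and the remaining capacity as the state; all of the numbers are made up for illustration.

    # 0/1 knapsack by dynamic programming: one stage per item,
    # state = remaining capacity, best[c] = best value achievable with capacity c.
    def knapsack(values, weights, capacity):
        best = [0] * (capacity + 1)
        for v, w in zip(values, weights):            # one decision stage per item
            for c in range(capacity, w - 1, -1):     # sweep the states (capacities)
                best[c] = max(best[c], best[c - w] + v)   # skip the item vs. take it
        return best[capacity]

    print(knapsack(values=[60, 100, 120], weights=[10, 20, 30], capacity=50))  # 220

At each stage only a one-dimensional array over capacities has to be updated, which is exactly the decomposition described above.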

According to whether the time variable is discrete or continuous, dynamic programming models can be divided into discrete and continuous decision processes; according to whether the evolution of the process is deterministic or stochastic, they can be divided into deterministic and stochastic decision processes. Combining these gives four types of model: discrete deterministic, discrete stochastic, continuous deterministic, and continuous stochastic. We focus on the discrete deterministic model.

2. Stochastic programming and fuzzy programming are the two main mathematical programming tools for dealing with random and fuzzy optimization problems; together they are referred to as uncertain programming. Their main purpose is to lay a foundation for the theory of optimization in uncertain environments. The theory of uncertain programming consists of three main classes of models: expected value models, chance-constrained programming, and dependent-chance programming.
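To give a concrete flavor of the first of these, the expected value model, here is a minimal Python sketch that approximates the expected objective by Monte Carlo sampling and then searches a grid of candidate decisions (a newsvendor-style stocking problem). The demand distribution, price, and cost are assumptions invented for illustration, not part of the original text.

    # Expected value model, approximated by sampling: choose an order quantity
    # that maximizes the sample-average profit under random demand.
    import numpy as np

    rng = np.random.default_rng(0)
    demand = rng.normal(loc=100, scale=20, size=10_000)   # simulated demand scenarios
    price, cost = 5.0, 3.0                                # assumed unit price and unit cost

    def expected_profit(order_qty):
        sold = np.minimum(order_qty, demand)
        return np.mean(price * sold - cost * order_qty)   # Monte Carlo estimate of E[profit]

    candidates = np.arange(50, 151)                       # candidate order quantities
    best_q = max(candidates, key=expected_profit)
    print(best_q, expected_profit(best_q))

Chance-constrained and dependent-chance models can be handled in a similar spirit, by estimating the relevant probabilities from samples.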

3. Material devoted specifically to stochastic programming is relatively scarce.

You can refer to the branches of operations research described below.

The object of study in mathematical programming is the arrangement and valuation problems that arise in planning and management work; the main problem it solves is, under given conditions, to find the arrangement that is optimal according to some criterion. Such problems can be expressed as finding the maximum or minimum of a function subject to constraints.

Mathematical programming differs fundamentally from the classical problem of finding extreme values: classical methods can only handle objective functions with simple expressions under simple constraints. Modern mathematical programming problems, in contrast, have complex objective functions and constraints and require numerical solutions to a prescribed accuracy, so the study of algorithms has received particular attention.

The simplest kind of problem here is linear programming. If both the constraints and the objective function are linear, the problem is called a linear program. Solving linear programs ultimately comes down to working with systems of linear equations, so methods for solving linear systems, together with knowledge of determinants and matrices, are essential tools for linear programming.
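In practice such problems are usually handed to a solver. As an illustration only, here is a minimal sketch of a small linear program solved with SciPy's linprog; the objective and constraints are made-up numbers, not taken from the text.

    # maximize 3x + 5y  subject to  x + 2y <= 14,  3x - y >= 0,  x - y <= 2,  x, y >= 0
    from scipy.optimize import linprog

    c = [-3, -5]                        # linprog minimizes, so negate to maximize
    A_ub = [[1, 2], [-3, 1], [1, -1]]   # every constraint rewritten in <= form
    b_ub = [14, 0, 2]

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
    print(res.x, -res.fun)              # optimal point (6, 4) and optimal value 38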

The emergence of linear programming and its solution method, the simplex method, played a major role in promoting the development of operations research. Many practical problems can be transformed into linear programs, and the simplex method is a well-established algorithm; together with the advent of computers, this has made the solution of large and complex practical problems a reality.

Nonlinear programming is a further development and continuation of linear programming. Many practical problems, such as design problems and economic equilibrium problems, fall into the category of nonlinear programming. Nonlinear programming has expanded the range of application of mathematical programming and has also posed many basic theoretical questions to mathematicians, which in turn has driven the development of fields such as convex analysis and numerical analysis. There is also a class of programming problems related to time, called dynamic programming; in recent years it has become an important and frequently used tool for optimal control problems in engineering control, technical physics, communications, and other areas.
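For comparison with the linear case above, here is a minimal sketch of a small nonlinear program solved with SciPy's general-purpose minimize routine; the quadratic objective and the constraint are chosen purely for illustration.

    # minimize (x - 1)^2 + (y - 2.5)^2  subject to  x + 2y <= 4,  x, y >= 0
    from scipy.optimize import minimize

    objective = lambda z: (z[0] - 1) ** 2 + (z[1] - 2.5) ** 2
    constraints = [{"type": "ineq", "fun": lambda z: 4 - z[0] - 2 * z[1]}]  # g(z) >= 0 form
    bounds = [(0, None), (0, None)]

    res = minimize(objective, x0=[0.0, 0.0], bounds=bounds, constraints=constraints)
    print(res.x, res.fun)   # approximately (0.6, 1.7) with objective value 0.8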

Queuing theory is another branch of operations research, also called the theory of stochastic service systems. Its purpose is to answer the question of how to improve a service organization, or how to organize the objects being served, so that some chosen performance indicator reaches its optimum: for example, how many docks a harbor should have, or how many maintenance workers a factory should employ.

Queuing theory began in the early twentieth century with the Danish engineer Erlang's studies of the efficiency of telephone exchanges. During the Second World War it was developed further in order to estimate the capacity of airfield runways, and related disciplines such as renewal theory and reliability theory have developed alongside it.

Because queuing is a random phenomenon, probability theory, the study of random phenomena, is the main tool of queuing theory; differential equations and difference equations are also used. Queuing theory pictures the object of study as customers arriving at a service desk and asking to be served. If the desk is occupied by other customers, a queue forms; on the other hand, the desk itself is sometimes idle and sometimes busy. The task is to find, mathematically, the probability distributions of quantities such as the customers' waiting time and the queue length.
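For the simplest single-server model with Poisson arrivals and exponential service times (the standard M/M/1 queue), these distributions lead to closed-form formulas for the average quantities. The sketch below simply evaluates those textbook formulas; the arrival and service rates are arbitrary illustrative numbers.

    # Standard M/M/1 queue metrics (requires arrival_rate < service_rate).
    def mm1_metrics(arrival_rate, service_rate):
        rho = arrival_rate / service_rate           # server utilization
        L = rho / (1 - rho)                         # mean number of customers in system
        W = 1 / (service_rate - arrival_rate)       # mean time spent in system
        Lq = rho ** 2 / (1 - rho)                   # mean number waiting in queue
        Wq = rho / (service_rate - arrival_rate)    # mean waiting time before service
        return {"rho": rho, "L": L, "W": W, "Lq": Lq, "Wq": Wq}

    print(mm1_metrics(arrival_rate=2.0, service_rate=3.0))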

Queuing theory is widely used in daily life, for example in regulating the water volume of reservoirs, arranging production lines, scheduling railway marshalling yards, and designing power grids.

The theory of games, also called game theory, is another branch: the Tian Ji horse race mentioned earlier is a typical game theory problem. As a branch of operations research, game theory has a history of only a few decades. The mathematician now generally credited with systematically founding the discipline is the Hungarian-American mathematician and father of the computer, von Neumann.

The earliest mathematical study of games began with chess: how to determine a winning strategy. Because it studies conflict between two sides and the question of winning strategies, the discipline has very important military applications. In recent years mathematicians have also studied combat and pursuit problems between mines and ships, or between fighters and bombers, and have proposed mathematical theories in which both the pursuer and the pursued make autonomous decisions. With the further development of artificial intelligence research, new demands are being placed on game theory.
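The simplest finite two-player zero-sum games can be solved by linear programming, which ties game theory back to the programming methods above. Here is a minimal sketch, using matching pennies as an assumed example payoff matrix.

    # Solve a two-player zero-sum matrix game for the row player's optimal mixed
    # strategy x and the game value v:  maximize v  s.t.  A^T x >= v, sum(x) = 1, x >= 0.
    import numpy as np
    from scipy.optimize import linprog

    A = np.array([[1.0, -1.0],       # payoff matrix for the row player (matching pennies)
                  [-1.0, 1.0]])
    m, n = A.shape

    c = np.zeros(m + 1); c[-1] = -1.0                   # variables (x_1..x_m, v); minimize -v
    A_ub = np.hstack([-A.T, np.ones((n, 1))])           # v - (A^T x)_j <= 0 for every column j
    b_ub = np.zeros(n)
    A_eq = np.hstack([np.ones((1, m)), np.zeros((1, 1))])
    b_eq = np.array([1.0])                              # probabilities sum to 1
    bounds = [(0, None)] * m + [(None, None)]           # x >= 0, v unrestricted

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    print(res.x[:m], res.x[-1])                         # optimal strategy (0.5, 0.5), value 0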

Search theory is a branch of operations research that emerged from the needs of the Second World War. It studies how, under constraints on resources and means of detection, to design and implement an optimal plan for finding a given target. It arose when the Allied air forces and navies studied how to counter Axis submarine activity, fleet transport, and troop deployments. Search theory has also achieved notable results in practice: for example, in the 1960s the United States used it successfully to search the Atlantic for the lost nuclear submarines Thresher and Scorpion and to find a hydrogen bomb lost in the Mediterranean.


Operations research has a wide range of applications, and has penetrated such areas as service, inventory, search, population, confrontation, control, scheduling, resource allocation, site location, energy, design, production, reliability, and so on.

Of these, queuing theory is probably the branch most closely related to stochastic programming.

For more specifics, I suggest you consult a professional instructor.

I hope this helps