Optimization Algorithm Examples: Data Fitting with Nonlinear Least Squares

Nonlinear Least Squares Problem

Overview:

This Example fits a model function to data using nonlinear least squares. The fit is then repeated with the outlier data points excluded, to examine their influence on the model.

Objective function:

Task:

$$\text{minimize} \quad \sum_{i=1}^{n} \left( f(t_i) - y_i \right)^2$$

The objective is the sum of squared residuals between the model function $f(t)$ and the observed data $y$:

$$\text{minimize} \quad \sum_{i=1}^{n} \left( x_1 t_i^2 + x_2 t_i + x_3 + x_4 \sin(x_5 t_i) - y_i \right)^2$$

where $n$ is the number of data points, $t_i$ is the $t$-coordinate of the $i$-th observation, and $y_i$ is its $y$-coordinate.
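Read concretely, the objective is just this sum of squared residuals evaluated at a parameter vector. A minimal NumPy sketch (using made-up data, not the Example's observations):

```python
import numpy as np

def objective(x, t, y):
    """Sum of squared residuals for f(t) = x1*t^2 + x2*t + x3 + x4*sin(x5*t)."""
    r = x[0]*t**2 + x[1]*t + x[2] + x[3]*np.sin(x[4]*t) - y
    return np.sum(r**2)

t = np.array([0.0, 1.0, 2.0])
y = np.array([1.0, 2.0, 5.0])                  # generated from y = t^2 + 1
x_exact = np.array([1.0, 0.0, 1.0, 0.0, 1.0])  # f(t) = t^2 + 1
print(objective(x_exact, t, y))                # 0.0: the model matches the data exactly
```

A solver such as the one used in this Example searches for the $x$ that drives this quantity as low as possible.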

Decision variables:

Variable   Range
  x1       (-∞, ∞)
  x2       (-∞, ∞)
  x3       (-∞, ∞)
  x4       (-∞, ∞)
  x5       (-∞, ∞)

There are five decision variables, $x_1, x_2, x_3, x_4, x_5$, corresponding to the parameters of the model function $f(t) = x_1 t^2 + x_2 t + x_3 + x_4 \sin(x_5 t)$.
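As a quick sanity check on this parameterization, note that every term except $x_3$ vanishes at $t = 0$, so $f(0) = x_3$. Evaluating the model at the parameters reported by the outlier-free solve in the output below illustrates this (the parameter values are copied from that output):

```python
import numpy as np

def f(t, x):
    """Model f(t) = x1*t^2 + x2*t + x3 + x4*sin(x5*t)."""
    return x[0]*t**2 + x[1]*t + x[2] + x[3]*np.sin(x[4]*t)

# Fitted parameters from the second (outlier-free) solve
x_fit = [3.53888e-01, 1.06575e+00, 1.91383e-03, 2.17299e-01, 5.17660e+00]
print(f(0.0, x_fit))   # 0.00191383, i.e. exactly x3
```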

Constraints:

No explicit constraints are imposed on the decision variables $x_1, x_2, x_3, x_4, x_5$ in this Example. In the final fit, however, $x_1$ is fixed at its known value:

$$x_1 = 0.3$$
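Fixing $x_1$ with equal lower and upper bounds (as the source below does via handle_set_bound) is mathematically equivalent to substituting the constant into the model and fitting only the remaining four parameters. A hedged sketch of that substitution in plain NumPy (made-up data, not the nAG API):

```python
import numpy as np

X1_FIXED = 0.3  # known value of the leading coefficient

def residuals_fixed(free, t, y):
    """Residuals with x1 pinned at 0.3; `free` holds the remaining (x2, x3, x4, x5)."""
    x2, x3, x4, x5 = free
    return X1_FIXED*t**2 + x2*t + x3 + x4*np.sin(x5*t) - y

t = np.array([0.0, 0.5, 1.0, 1.5])
y = 0.3*t**2 + 2.0*t + 1.0            # data generated with x1 = 0.3 and x4 = 0
free_true = np.array([2.0, 1.0, 0.0, 1.0])
print(np.sum(residuals_fixed(free_true, t, y)**2))   # 0.0 at the true free parameters
```

The bound-based approach used in the Example has the advantage that the problem handle keeps all five variables, so the fix can later be relaxed without rebuilding the model.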

Command to run the Example:

python -m naginterfaces.library.examples.opt.handle_disable_ex

Command to display the source code:

python -c "import inspect; from naginterfaces.library.examples.opt import handle_disable_ex; print(''.join(inspect.getsourcelines(handle_disable_ex)[0]))"

Example output:

naginterfaces.library.opt.handle_disable Python Example Results
First solve the problem with the outliers
--------------------------------------------------------
 E04GG, Nonlinear least squares method for bound-constrained problems
 Status: converged, an optimal solution was found
 Value of the objective             1.05037E+00
 Norm of gradient                   8.78014E-06
 Norm of scaled gradient            6.05781E-06
 Norm of step                       1.47886E-01
 Primal variables:
   idx   Lower bound       Value       Upper bound
     1       -inf        3.61301E-01        inf
     2       -inf        9.10227E-01        inf
     3       -inf        3.42138E-03        inf
     4       -inf       -6.08965E+00        inf
     5       -inf        6.24881E-04        inf
Now remove the outlier residuals from the problem handle
 E04GG, Nonlinear least squares method for bound-constrained problems
 Status: converged, an optimal solution was found
 Value of the objective             5.96811E-02
 Norm of gradient                   1.19914E-06
 Norm of scaled gradient            3.47087E-06
 Norm of step                       3.49256E-06
 Primal variables:
   idx   Lower bound       Value       Upper bound
     1       -inf        3.53888E-01        inf
     2       -inf        1.06575E+00        inf
     3       -inf        1.91383E-03        inf
     4       -inf        2.17299E-01        inf
     5       -inf        5.17660E+00        inf
--------------------------------------------------------
Assuming the outliers points are measured again
we can enable the residuals and adjust the values
--------------------------------------------------------
 E04GG, Nonlinear least squares method for bound-constrained problems
 Status: converged, an optimal solution was found
 Value of the objective             6.51802E-02
 Norm of gradient                   2.57338E-07
 Norm of scaled gradient            7.12740E-07
 Norm of step                       1.56251E-05
 Primal variables:
   idx   Lower bound       Value       Upper bound
     1   3.00000E-01    3.00000E-01    3.00000E-01
     2       -inf       1.06039E+00         inf
     3       -inf       2.11765E-02         inf
     4       -inf       2.11749E-01         inf
     5       -inf       5.16415E+00         inf

Manual:

handle_solve_bxnl manual

Source:

#!/usr/bin/env python3
"``naginterfaces.library.opt.handle_disable`` Python Example."

# nAG Copyright 2020-2022.

# pylint: disable=invalid-name

import numpy as np

from naginterfaces.base import utils
from naginterfaces.library import opt

def main():
    """
    Example for :func:`naginterfaces.library.opt.handle_disable`

    nAG Optimization Modelling suite: disabling a residual from a nonlinear
    least-square problem.

    >>> main()
    naginterfaces.library.opt.handle_disable Python Example Results
    First solve the problem with the outliers
    --------------------------------------------------------
     E04GG, Nonlinear least squares method for bound-constrained problems
     Status: converged, an optimal solution was found
     Value of the objective             1.05037E+00
     Norm of gradient                   8.78014E-06
     Norm of scaled gradient            6.05781E-06
     Norm of step                       1.47886E-01
    <BLANKLINE>
     Primal variables:
       idx   Lower bound       Value       Upper bound
         1       -inf        3.61301E-01        inf
         2       -inf        9.10227E-01        inf
         3       -inf        3.42138E-03        inf
         4       -inf       -6.08965E+00        inf
         5       -inf        6.24881E-04        inf
    <BLANKLINE>
    Now remove the outlier residuals from the problem handle
    <BLANKLINE>
     E04GG, Nonlinear least squares method for bound-constrained problems
     Status: converged, an optimal solution was found
     Value of the objective             5.96811E-02
     Norm of gradient                   1.19914E-06
     Norm of scaled gradient            3.47087E-06
     Norm of step                       3.49256E-06
    <BLANKLINE>
     Primal variables:
       idx   Lower bound       Value       Upper bound
         1       -inf        3.53888E-01        inf
         2       -inf        1.06575E+00        inf
         3       -inf        1.91383E-03        inf
         4       -inf        2.17299E-01        inf
         5       -inf        5.17660E+00        inf
    --------------------------------------------------------
    <BLANKLINE>
    Assuming the outliers points are measured again
    we can enable the residuals and adjust the values
    <BLANKLINE>
    --------------------------------------------------------
     E04GG, Nonlinear least squares method for bound-constrained problems
     Status: converged, an optimal solution was found
     Value of the objective             6.51802E-02
     Norm of gradient                   2.57338E-07
     Norm of scaled gradient            7.12740E-07
     Norm of step                       1.56251E-05
    <BLANKLINE>
     Primal variables:
       idx   Lower bound       Value       Upper bound
         1   3.00000E-01    3.00000E-01    3.00000E-01
         2       -inf       1.06039E+00         inf
         3       -inf       2.11765E-02         inf
         4       -inf       2.11749E-01         inf
         5       -inf       5.16415E+00         inf
    """

    print(
        'naginterfaces.library.opt.handle_disable Python Example Results'
    )

    # problem data
    # number of observations
    nres = 30
    # observations
    t = [-1.0000000000000000E+00, -9.3103448275862066E-01,
         -8.6206896551724133E-01, -7.9310344827586210E-01,
         -7.2413793103448276E-01, -6.5517241379310343E-01,
         -5.8620689655172420E-01, -5.1724137931034486E-01,
         -4.4827586206896552E-01, -3.7931034482758619E-01,
         -3.1034482758620685E-01, -2.4137931034482762E-01,
         -1.7241379310344829E-01, -1.0344827586206895E-01,
         -3.4482758620689724E-02, 3.4482758620689724E-02,
         1.0344827586206895E-01, 1.7241379310344818E-01,
         2.4137931034482762E-01, 3.1034482758620685E-01,
         3.7931034482758630E-01, 4.4827586206896552E-01,
         5.1724137931034475E-01, 5.8620689655172420E-01,
         6.5517241379310343E-01, 7.2413793103448265E-01,
         7.9310344827586210E-01, 8.6206896551724133E-01,
         9.3103448275862055E-01, 1.0000000000000000E+00]
    y = [-4.7355262950125371E-01, -5.4938194860076961E-01,
         -3.9825239170869919E-01, -3.8856020837234490E-01,
         -5.5342876546898057E-01, -4.9096203345353551E-01,
         -5.3572741683518588E-01, -4.7612745760879621E-01,
         -5.3500371923553647E-01, 6.0158102095753345E-01,
         -5.8735647599146978E-01, -4.4672492128166008E-01,
         -2.8302528479868733E-01, -3.0311152621895832E-01,
         -4.1648482978955480E-02, 3.7631282343379285E-02,
         1.6504195889737350E-01, 5.1964885199180544E-01,
         5.4702301251386876E-01, -4.4537928919142311E-01,
         5.9042453784268267E-01, 6.9016702395652996E-01,
         6.9143394337665087E-01, 7.8263998594593021E-01,
         8.1261679746103432E-01, 7.5887968830904762E-01,
         9.5316846217349571E-01, 1.1014159319048389E+00,
         1.0675894279571976E+00, 1.1538027971634699E+00]

    # Define the data structure to be passed to the callback functions
    data = {'t': t, 'y': y}

    # try to fit the model
    # f(t) = at^2 + bt + c + d sin(omega t)
    # To the data {(t_i, y_i)}
    nvar = 5

    # Initialize the Optimization model handle with the number of variables
    handle = opt.handle_init(nvar)

    # Define a dense nonlinear least-squares objective function
    opt.handle_set_nlnls(handle, nres)

    # Set some optional parameters to control the output of the solver
    for option in [
            'Print Options = NO',
            'Print Level = 1',
            'Print Solution = X',
            'Bxnl Iteration Limit = 100',
    ]:
        opt.handle_opt_set(handle, option)

    print('First solve the problem with the outliers')
    print('--------------------------------------------------------')

    # Use an explicit I/O manager for abbreviated iteration output:
    iom = utils.FileObjManager(locus_in_output=False)

    def lsqfun(x, nres, inform, data):
        """
        Objective function callback passed to the least squares solver.
        """
        rx = np.zeros(nres)
        t = data['t']
        y = data['y']
        for i in range(nres):
            rx[i] = (
                x[0]*t[i]**2 + x[1]*t[i] + x[2] + x[3]*np.sin(x[4]*t[i]) - y[i]
            )
        return rx, inform

    def lsqgrd(x, nres, rdx, inform, data):
        """
        Computes the Jacobian of the least square residuals.
        """
        n = len(x)
        t = data['t']
        for i in range(nres):
            rdx[i*n+0] = t[i]**2
            rdx[i*n+1] = t[i]
            rdx[i*n+2] = 1.0
            rdx[i*n+3] = np.sin(x[4]*t[i])
            rdx[i*n+4] = x[3]*t[i]*np.cos(x[4]*t[i])
        return inform

    # Call the solver
    x = np.ones(nvar, dtype=float)
    opt.handle_solve_bxnl(handle, lsqfun, lsqgrd, x, nres, data=data,
                          io_manager=iom)

    print()
    print('Now remove the outlier residuals from the problem handle')
    print()

    # Disable the two outlier residuals
    opt.handle_disable(handle, comp='NLS', idx=[10, 20])

    # Solve the problem again
    x = np.ones(nvar, dtype=float)
    opt.handle_solve_bxnl(handle, lsqfun, lsqgrd, x, nres, data=data,
                          io_manager=iom)


    print('--------------------------------------------------------')
    print()
    print('Assuming the outliers points are measured again')
    print('we can enable the residuals and adjust the values')
    print()
    print('--------------------------------------------------------')

    # Fix the first variable to its known value of 0.3
    # enable the disabled residuals and adjust the values in the data
    opt.handle_set_bound(handle, comp='X', idx=1, bli=0.3, bui=0.3)
    opt.handle_enable(handle, comp='NLS', idx=[10, 20])
    data['y'][9] = -0.515629
    data['y'][19] = 0.54920

    # Solve the problem again
    x = np.ones(nvar, dtype=float)
    opt.handle_solve_bxnl(handle, lsqfun, lsqgrd, x, nres, data=data,
                          io_manager=iom)

if __name__ == '__main__':
    import doctest
    import sys
    sys.exit(
        doctest.testmod(
            None, verbose=True, report=False,
            optionflags=doctest.ELLIPSIS | doctest.REPORT_NDIFF,
        ).failed
    )

© Nihon Numerical Algorithms Group KK 2025