
  • The iris dataset (each row: sepal length, sepal width, petal length, petal width, species):

    5.1,3.5,1.4,0.2,Iris-setosa
    4.9,3.0,1.4,0.2,Iris-setosa
    4.7,3.2,1.3,0.2,Iris-setosa
    4.6,3.1,1.5,0.2,Iris-setosa
    5.0,3.6,1.4,0.2,Iris-setosa
    5.4,3.9,1.7,0.4,Iris-setosa
    4.6,3.4,1.4,0.3,Iris-setosa
    5.0,3.4,1.5,0.2,Iris-setosa
    4.4,2.9,1.4,0.2,Iris-setosa
    4.9,3.1,1.5,0.1,Iris-setosa
    5.4,3.7,1.5,0.2,Iris-setosa
    4.8,3.4,1.6,0.2,Iris-setosa
    4.8,3.0,1.4,0.1,Iris-setosa
    4.3,3.0,1.1,0.1,Iris-setosa
    5.8,4.0,1.2,0.2,Iris-setosa
    5.7,4.4,1.5,0.4,Iris-setosa
    5.4,3.9,1.3,0.4,Iris-setosa
    5.1,3.5,1.4,0.3,Iris-setosa
    5.7,3.8,1.7,0.3,Iris-setosa
    5.1,3.8,1.5,0.3,Iris-setosa
    5.4,3.4,1.7,0.2,Iris-setosa
    5.1,3.7,1.5,0.4,Iris-setosa
    4.6,3.6,1.0,0.2,Iris-setosa
    5.1,3.3,1.7,0.5,Iris-setosa
    4.8,3.4,1.9,0.2,Iris-setosa
    5.0,3.0,1.6,0.2,Iris-setosa
    5.0,3.4,1.6,0.4,Iris-setosa
    5.2,3.5,1.5,0.2,Iris-setosa
    5.2,3.4,1.4,0.2,Iris-setosa
    4.7,3.2,1.6,0.2,Iris-setosa
    4.8,3.1,1.6,0.2,Iris-setosa
    5.4,3.4,1.5,0.4,Iris-setosa
    5.2,4.1,1.5,0.1,Iris-setosa
    5.5,4.2,1.4,0.2,Iris-setosa
    4.9,3.1,1.5,0.1,Iris-setosa
    5.0,3.2,1.2,0.2,Iris-setosa
    5.5,3.5,1.3,0.2,Iris-setosa
    4.9,3.1,1.5,0.1,Iris-setosa
    4.4,3.0,1.3,0.2,Iris-setosa
    5.1,3.4,1.5,0.2,Iris-setosa
    5.0,3.5,1.3,0.3,Iris-setosa
    4.5,2.3,1.3,0.3,Iris-setosa
    4.4,3.2,1.3,0.2,Iris-setosa
    5.0,3.5,1.6,0.6,Iris-setosa
    5.1,3.8,1.9,0.4,Iris-setosa
    4.8,3.0,1.4,0.3,Iris-setosa
    5.1,3.8,1.6,0.2,Iris-setosa
    4.6,3.2,1.4,0.2,Iris-setosa
    5.3,3.7,1.5,0.2,Iris-setosa
    5.0,3.3,1.4,0.2,Iris-setosa
    7.0,3.2,4.7,1.4,Iris-versicolor
    6.4,3.2,4.5,1.5,Iris-versicolor
    6.9,3.1,4.9,1.5,Iris-versicolor
    5.5,2.3,4.0,1.3,Iris-versicolor
    6.5,2.8,4.6,1.5,Iris-versicolor
    5.7,2.8,4.5,1.3,Iris-versicolor
    6.3,3.3,4.7,1.6,Iris-versicolor
    4.9,2.4,3.3,1.0,Iris-versicolor
    6.6,2.9,4.6,1.3,Iris-versicolor
    5.2,2.7,3.9,1.4,Iris-versicolor
    5.0,2.0,3.5,1.0,Iris-versicolor
    5.9,3.0,4.2,1.5,Iris-versicolor
    6.0,2.2,4.0,1.0,Iris-versicolor
    6.1,2.9,4.7,1.4,Iris-versicolor
    5.6,2.9,3.6,1.3,Iris-versicolor
    6.7,3.1,4.4,1.4,Iris-versicolor
    5.6,3.0,4.5,1.5,Iris-versicolor
    5.8,2.7,4.1,1.0,Iris-versicolor
    6.2,2.2,4.5,1.5,Iris-versicolor
    5.6,2.5,3.9,1.1,Iris-versicolor
    5.9,3.2,4.8,1.8,Iris-versicolor
    6.1,2.8,4.0,1.3,Iris-versicolor
    6.3,2.5,4.9,1.5,Iris-versicolor
    6.1,2.8,4.7,1.2,Iris-versicolor
    6.4,2.9,4.3,1.3,Iris-versicolor
    6.6,3.0,4.4,1.4,Iris-versicolor
    6.8,2.8,4.8,1.4,Iris-versicolor
    6.7,3.0,5.0,1.7,Iris-versicolor
    6.0,2.9,4.5,1.5,Iris-versicolor
    5.7,2.6,3.5,1.0,Iris-versicolor
    5.5,2.4,3.8,1.1,Iris-versicolor
    5.5,2.4,3.7,1.0,Iris-versicolor
    5.8,2.7,3.9,1.2,Iris-versicolor
    6.0,2.7,5.1,1.6,Iris-versicolor
    5.4,3.0,4.5,1.5,Iris-versicolor
    6.0,3.4,4.5,1.6,Iris-versicolor
    6.7,3.1,4.7,1.5,Iris-versicolor
    6.3,2.3,4.4,1.3,Iris-versicolor
    5.6,3.0,4.1,1.3,Iris-versicolor
    5.5,2.5,4.0,1.3,Iris-versicolor
    5.5,2.6,4.4,1.2,Iris-versicolor
    6.1,3.0,4.6,1.4,Iris-versicolor
    5.8,2.6,4.0,1.2,Iris-versicolor
    5.0,2.3,3.3,1.0,Iris-versicolor
    5.6,2.7,4.2,1.3,Iris-versicolor
    5.7,3.0,4.2,1.2,Iris-versicolor
    5.7,2.9,4.2,1.3,Iris-versicolor
    6.2,2.9,4.3,1.3,Iris-versicolor
    5.1,2.5,3.0,1.1,Iris-versicolor
    5.7,2.8,4.1,1.3,Iris-versicolor
    6.3,3.3,6.0,2.5,Iris-virginica
    5.8,2.7,5.1,1.9,Iris-virginica
    7.1,3.0,5.9,2.1,Iris-virginica
    6.3,2.9,5.6,1.8,Iris-virginica
    6.5,3.0,5.8,2.2,Iris-virginica
    7.6,3.0,6.6,2.1,Iris-virginica
    4.9,2.5,4.5,1.7,Iris-virginica
    7.3,2.9,6.3,1.8,Iris-virginica
    6.7,2.5,5.8,1.8,Iris-virginica
    7.2,3.6,6.1,2.5,Iris-virginica
    6.5,3.2,5.1,2.0,Iris-virginica
    6.4,2.7,5.3,1.9,Iris-virginica
    6.8,3.0,5.5,2.1,Iris-virginica
    5.7,2.5,5.0,2.0,Iris-virginica
    5.8,2.8,5.1,2.4,Iris-virginica
    6.4,3.2,5.3,2.3,Iris-virginica
    6.5,3.0,5.5,1.8,Iris-virginica
    7.7,3.8,6.7,2.2,Iris-virginica
    7.7,2.6,6.9,2.3,Iris-virginica
    6.0,2.2,5.0,1.5,Iris-virginica
    6.9,3.2,5.7,2.3,Iris-virginica
    5.6,2.8,4.9,2.0,Iris-virginica
    7.7,2.8,6.7,2.0,Iris-virginica
    6.3,2.7,4.9,1.8,Iris-virginica
    6.7,3.3,5.7,2.1,Iris-virginica
    7.2,3.2,6.0,1.8,Iris-virginica
    6.2,2.8,4.8,1.8,Iris-virginica
    6.1,3.0,4.9,1.8,Iris-virginica
    6.4,2.8,5.6,2.1,Iris-virginica
    7.2,3.0,5.8,1.6,Iris-virginica
    7.4,2.8,6.1,1.9,Iris-virginica
    7.9,3.8,6.4,2.0,Iris-virginica
    6.4,2.8,5.6,2.2,Iris-virginica
    6.3,2.8,5.1,1.5,Iris-virginica
    6.1,2.6,5.6,1.4,Iris-virginica
    7.7,3.0,6.1,2.3,Iris-virginica
    6.3,3.4,5.6,2.4,Iris-virginica
    6.4,3.1,5.5,1.8,Iris-virginica
    6.0,3.0,4.8,1.8,Iris-virginica
    6.9,3.1,5.4,2.1,Iris-virginica
    6.7,3.1,5.6,2.4,Iris-virginica
    6.9,3.1,5.1,2.3,Iris-virginica
    5.8,2.7,5.1,1.9,Iris-virginica
    6.8,3.2,5.9,2.3,Iris-virginica
    6.7,3.3,5.7,2.5,Iris-virginica
    6.7,3.0,5.2,2.3,Iris-virginica
    6.3,2.5,5.0,1.9,Iris-virginica
    6.5,3.0,5.2,2.0,Iris-virginica
    6.2,3.4,5.4,2.3,Iris-virginica
    5.9,3.0,5.1,1.8,Iris-virginica
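    The rows above (sepal length, sepal width, petal length, petal width, species) load cleanly with pandas; a minimal self-contained sketch using three sample rows, with column names chosen here for illustration:

    ```python
    import io
    import pandas as pd

    # Three sample rows in comma-separated form, one per Iris class
    csv_text = """5.1,3.5,1.4,0.2,Iris-setosa
    7.0,3.2,4.7,1.4,Iris-versicolor
    6.3,3.3,6.0,2.5,Iris-virginica"""

    cols = ["sepal_length", "sepal_width", "petal_length", "petal_width", "species"]
    df = pd.read_csv(io.StringIO(csv_text), header=None, names=cols,
                     skipinitialspace=True)
    print(df.shape)  # (3, 5)
    ```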


  • import numpy as np

    class Perceptron(object):
        """
        eta: learning rate
        n_iter: number of training passes over the weight vector
        w_: the neuron's weight vector
        errors_: records the neuron's misclassification count per epoch
        """
        def __init__(self, eta, n_iter):
            self.eta = eta
            self.n_iter = n_iter

        def net_input(self, X):
            """
            z = W0*1 + W1*X1 + ... + Wn*Xn
            """
            return np.dot(X, self.w_[1:]) + self.w_[0]

        def predict(self, X):
            return np.where(self.net_input(X) >= 0.0, 1, -1)

        def fit(self, X, y):
            """
            Train the neuron on the input data. X holds the sample vectors,
            y the correct class of each sample.
            X: shape [n_samples, n_features]
            e.g. X = [[1, 2, 3], [4, 5, 6]]
                 -> n_samples: 2
                 -> n_features: 3
                 y = [1, -1]
            """
            # Initialize the weight vector to zeros. The "+1" adds w0,
            # the threshold of the step function mentioned earlier.
            self.w_ = np.zeros(1 + X.shape[1])
            self.errors_ = []

            for _ in range(self.n_iter):
                errors = 0
                # zip(X, y) pairs each sample with its label:
                # X = [[1,2,3],[4,5,6]], y = [1, -1]
                # -> [([1,2,3], 1), ([4,5,6], -1)]
                for xi, target in zip(X, y):
                    # update = learning rate * (y - y')
                    update = self.eta * (target - self.predict(xi))
                    # xi is a vector, so update * xi updates every weight at once:
                    # w(1) += X[1]*update, w(2) += X[2]*update, w(3) += X[3]*update
                    self.w_[1:] += update * xi
                    self.w_[0] += update
                    errors += int(update != 0.0)
                # Record the misclassification count once per epoch
                self.errors_.append(errors)
    
    
    file?=?"D:/PyCharm_test_file/Jupyter_test/iris1.xlsx"
    import?pandas?as?pd
    
    df?=?pd.read_excel(file,header=None)
    #df.head(10)
    #print(df)??
    
    
    import?matplotlib.pyplot?as?plt
    import?numpy?as?npy?=?df.loc[0:99,?4].values
    y?=?np.where(y?==?'Iris-setosa',?-1,?1)
    #print(y)
    X?=?df.iloc[0:100,?[0,?2]].values
    #print(X)
    plt.scatter(X[:50,?0],?X[:50,?1],?color='red',?marker='o',?label='setosa')
    plt.scatter(X[50:100,?0],?X[50:100,?1],?color='blue',?marker='x',?label='vericolor')
    plt.rcParams['font.sans-serif']=['SimHei']
    plt.xlabel('花瓣長度')
    plt.ylabel('花莖長度')
    plt.legend(loc='upper?left')
    plt.show()??
    
    
    
    from matplotlib.colors import ListedColormap

    def plot_decision_regions(X, y, classifier, resolution=0.02):
        markers = ('s', 'x', 'o', 'v')
        colors = ('red', 'blue', 'lightgreen', 'gray', 'cyan')
        cmap = ListedColormap(colors[:len(np.unique(y))])
        x1_min, x1_max = X[:, 0].min() - 1, X[:, 0].max() + 1
        x2_min, x2_max = X[:, 1].min() - 1, X[:, 1].max() + 1
        # Expand the vectors built by arange over the x1/x2 ranges into two 2-D grids
        xx1, xx2 = np.meshgrid(np.arange(x1_min, x1_max, resolution),
                               np.arange(x2_min, x2_max, resolution))
        # Predict a class for every grid point (ravel flattens each grid to 1-D)
        Z = classifier.predict(np.array([xx1.ravel(), xx2.ravel()]).T)
        # Reshape Z back to the 2-D shape of xx1 and draw the dividing line between
        # the two classification regions -- the data must be linearly separable
        Z = Z.reshape(xx1.shape)
        plt.contourf(xx1, xx2, Z, alpha=0.4, cmap=cmap)
        plt.xlim(xx1.min(), xx1.max())
        plt.ylim(xx2.min(), xx2.max())
        for idx, cl in enumerate(np.unique(y)):
            plt.scatter(x=X[y == cl, 0], y=X[y == cl, 1], alpha=0.8,
                        c=cmap(idx), marker=markers[idx], label=cl)


    ppn = Perceptron(0.1, 10)
    ppn.fit(X, y)
    plot_decision_regions(X, y, ppn, resolution=0.02)


  • Classification with the adaptive linear neuron

    class AdalineGD(object):
        def __init__(self, eta=0.1, n_iter=50):
            self.eta = eta
            self.n_iter = n_iter

        def fit(self, X, y):  # the biggest difference from the perceptron
            self.w_ = np.zeros(1 + X.shape[1])
            self.cost_ = []  # cost function: tracks improvement; it should keep shrinking
            for i in range(self.n_iter):
                output = self.net_input(X)
                errors = y - output
                self.w_[1:] += self.eta * X.T.dot(errors)
                self.w_[0] += self.eta * errors.sum()
                cost = (errors ** 2).sum() / 2.0
                self.cost_.append(cost)
            return self

        def net_input(self, X):
            return np.dot(X, self.w_[1:]) + self.w_[0]

        def activation(self, X):
            return self.net_input(X)

        def predict(self, X):
            return np.where(self.activation(X) >= 0, 1, -1)

    # Run the algorithm: a lower learning rate makes each weight update more
    # precise; more iterations make the optimized result more accurate.
    ada = AdalineGD(eta=0.0001, n_iter=50)
    ada.fit(X, y)
    plot_decision_regions(X, y, classifier=ada)  # feed the data to the trained network and plot its predictions
    plt.title('Adaline-Gradient descent')
    plt.xlabel('sepal length')
    plt.ylabel('petal length')
    plt.legend(loc='upper left')
    plt.show()

    plt.plot(range(1, len(ada.cost_)+1), ada.cost_, marker='o')  # check the improvement
    plt.xlabel('Epochs')  # number of self-training iterations
    plt.ylabel('sum-squared-error')  # the cost per epoch
  • The perceptron's activation function is the step function: input above the threshold --> output 1, below the threshold --> output -1.

    The adaptive linear neuron's activation function instead takes the weighted sum of the inputs directly: w0 + x1*w1 + x2*w2 + ...; it then compares this result with the true label and, whenever they differ, adjusts the weights dynamically toward the given label --> gradient descent.

    Gradient descent:

    The sum-of-squares cost function is U-shaped. Take its partial derivative with respect to a weight w: where the slope of the tangent is negative (positive), increase (decrease) the corresponding weight w.
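    The descent step described above can be checked numerically. A minimal sketch on hypothetical toy data: one gradient-descent step on the sum-of-squares cost, moving each weight against its partial derivative:

    ```python
    import numpy as np

    # Hypothetical toy data for illustration: 4 samples, 2 features
    X = np.array([[1.0, 2.0], [2.0, 1.0], [3.0, 4.0], [4.0, 3.0]])
    y = np.array([-1, -1, 1, 1])

    eta = 0.01                    # learning rate
    w = np.zeros(1 + X.shape[1])  # w[0] is the threshold weight w0

    def cost(w):
        # Sum-of-squares cost: (1/2) * sum((y - output)^2)
        errors = y - (np.dot(X, w[1:]) + w[0])
        return (errors ** 2).sum() / 2.0

    before = cost(w)

    # One gradient-descent step: move each weight against its partial derivative
    errors = y - (np.dot(X, w[1:]) + w[0])
    w[1:] += eta * X.T.dot(errors)
    w[0] += eta * errors.sum()

    after = cost(w)
    print(after < before)  # True: the step moved downhill on the U-shaped cost
    ```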


  • Data parsing and visualization

    import pandas as pd
    import matplotlib.pyplot as plt
    import numpy as np

    file = "iris.data"  # hypothetical path to the iris data file; adjust to your local copy
    df = pd.read_csv(file, header=None)
    y = df.loc[0:99, 4].values
    y = np.where(y == 'Iris-setosa', -1, 1)
    X = df.iloc[0:100, [0, 2]].values
    plt.scatter(X[:50, 0], X[:50, 1], color='red', marker='o', label='setosa')
    plt.scatter(X[50:100, 0], X[50:100, 1], color='blue', marker='x', label='versicolor')
    plt.xlabel('sepal length')
    plt.ylabel('petal length')
    plt.legend(loc='upper left')
    plt.show()

    Data classification (feed the data to be predicted into the neural network and draw the result graphically)

    from matplotlib.colors import ListedColormap

    def plot_decision_regions(X, y, classifier, resolution=0.02):
        markers = ('s', 'x', 'o', 'v')
        colors = ('red', 'blue', 'lightgreen', 'gray', 'cyan')
        cmap = ListedColormap(colors[:len(np.unique(y))])
        x1_min, x1_max = X[:, 0].min() - 1, X[:, 0].max() + 1
        x2_min, x2_max = X[:, 1].min() - 1, X[:, 1].max() + 1
        # Expand the vectors built by arange over the x1/x2 ranges into two 2-D grids
        xx1, xx2 = np.meshgrid(np.arange(x1_min, x1_max, resolution),
                               np.arange(x2_min, x2_max, resolution))
        # Predict a class for every grid point (ravel flattens each grid to 1-D)
        Z = classifier.predict(np.array([xx1.ravel(), xx2.ravel()]).T)
        # Reshape Z back to the 2-D shape of xx1 and draw the dividing line between
        # the two classification regions -- the data must be linearly separable
        Z = Z.reshape(xx1.shape)
        plt.contourf(xx1, xx2, Z, alpha=0.4, cmap=cmap)
        plt.xlim(xx1.min(), xx1.max())
        plt.ylim(xx2.min(), xx2.max())
        for idx, cl in enumerate(np.unique(y)):
            plt.scatter(x=X[y == cl, 0], y=X[y == cl, 1], alpha=0.8,
                        c=cmap(idx), marker=markers[idx], label=cl)
  • The perceptron's classification algorithm

    import numpy as np

    class Perceptron(object):
        def __init__(self, eta=0.01, n_iter=10):  # eta: learning rate; n_iter: number of training passes over the weight vector
            self.eta = eta
            self.n_iter = n_iter

        # Train the neuron on the input samples. X: shape [n_samples, n_features];
        # n_features is the number of input signals X the neuron receives.
        def fit(self, X, y):
            self.w_ = np.zeros(1 + X.shape[1])  # initial weights are zero; the "+1" adds w0, the threshold of the activation function
            self.errors_ = []  # errors_ records the neuron's misclassification count per epoch (a list)

            for _ in range(self.n_iter):
                errors = 0
                for xi, target in zip(X, y):
                    # Update the weights
                    update = self.eta * (target - self.predict(xi))
                    self.w_[1:] += update * xi
                    # Update the threshold
                    self.w_[0] += update

                    errors += int(update != 0.0)
                self.errors_.append(errors)

        def net_input(self, X):  # dot product of the input signals X and the weights w
            return np.dot(X, self.w_[1:]) + self.w_[0]

        def predict(self, X):  # predict the class
            return np.where(self.net_input(X) >= 0.0, 1, -1)


  • The perceptron algorithm requires the data to be linearly separable
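    This requirement is easy to demonstrate: trained on AND-style labels (linearly separable), a perceptron reaches zero errors, while on XOR labels (not linearly separable) it never does. A minimal sketch; the helper `perceptron_errors` is written here for illustration and mirrors the class above:

    ```python
    import numpy as np

    def perceptron_errors(X, y, eta=0.1, n_iter=20):
        """Train a perceptron and return the misclassification count of the last epoch."""
        w = np.zeros(1 + X.shape[1])
        last_errors = 0
        for _ in range(n_iter):
            last_errors = 0
            for xi, target in zip(X, y):
                pred = 1 if np.dot(xi, w[1:]) + w[0] >= 0.0 else -1
                update = eta * (target - pred)
                w[1:] += update * xi
                w[0] += update
                last_errors += int(update != 0.0)
        return last_errors

    X_gate = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y_and = np.array([-1, -1, -1, 1])  # AND labels: linearly separable
    y_xor = np.array([-1, 1, 1, -1])   # XOR labels: not linearly separable

    print(perceptron_errors(X_gate, y_and))  # 0 -- training converges
    print(perceptron_errors(X_gate, y_xor))  # stays above 0 -- no separating line exists
    ```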

  • Algorithms used in machine learning to classify data: 1. the perceptron; 2. the adaptive linear neuron.

    The activation function (also called the unit step function).

    The essence of machine learning is to imitate how human neurons process information. According to neuroscience research, a neuron can be viewed as a simple logic gate with a binary output.
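    The logic-gate view above can be sketched as a single neuron with a unit step activation; the weights and threshold below are hand-picked for illustration, not learned:

    ```python
    import numpy as np

    def step_neuron(x, w, threshold):
        """Fire (output 1) when the weighted input sum reaches the threshold, else 0."""
        return 1 if np.dot(w, x) >= threshold else 0

    # Hand-picked weights and threshold turn the neuron into an AND gate
    w = np.array([1.0, 1.0])
    for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
        print(x, step_neuron(np.array(x), w, threshold=1.5))  # only (1, 1) fires
    ```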

  • The adaptive linear neuron

  • The mathematical representation of a neuron

    The activation function

    Expressions for the transpose and dot product of vectors

  • Course outline


  • Summary of the algorithm's steps

  • Steps of the perceptron's data-classification algorithm

  • The positive-feedback circuit
  • Updating the weights


Course prerequisites
Some programming background, e.g. knowledge of C.
What the instructor says you will learn:
1. Basic concepts of machine learning; 2. Data-classification algorithms; 3. Implementing the perceptron classification algorithm; 4. Data parsing and visualization; 5. Training a neural network with data; 6. Classifying data with the trained network; 7. Principles and implementation of the adaptive linear neuron; 8. Classifying data with an adaptive linear neuron network.
